-
- From owner-cypherpunks@toad.com Thu Jul 29 13:15:43 1993
- id <AA19932>; Thu, 29 Jul 1993 13:15:31 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA19702; Thu, 29 Jul 93 14:54:48 -0400
- Received: by toad.com id AA18528; Thu, 29 Jul 93 11:43:43 PDT
- Received: by toad.com id AA18525; Thu, 29 Jul 93 11:43:37 PDT
- Return-Path: <hh@pmantis.berkeley.edu>
- Received: from pmantis.berkeley.edu ([128.32.182.94]) by toad.com id AA18521; T
- Received: by pmantis.berkeley.edu (AIX 3.2/UCB 5.64/4.03)
- id AA18858; Thu, 29 Jul 1993 11:43:29 -0700
- Date: Thu, 29 Jul 1993 11:43:29 -0700
- Message-Id: <9307291843.AA18858@pmantis.berkeley.edu>
- To: cypherpunks@toad.com
- From: nobody@pmantis.berkeley.edu
- Subject: Chaum's Dining Cryptographers [LONG] (was: Digital Money again)
- Remailed-By: Tommy the Tourist <tommy@out>
- Status: OR
-
- Someone suggested that Allan Bailey seek this paper on soda.berkeley.edu. It is
- not there, but it was posted to the list last December. Here it is again.
-
- Date: 11 Dec 1992 14:58:38 -0800
- From: nobody@pmantis.berkeley.edu
- Subject: Chaum's "The Dining Cryptographers Problem" (VERY LONG)
- To: cypherpunks@toad.com
- Message-id: <9212112258.AA09353@pmantis.berkeley.edu>
- Content-transfer-encoding: 7BIT
- Remailed-By: Tommy the Tourist <tommy@out>
-
- The following article is brought to you by the Information Liberation
- Front (ILF), a group dedicated to the timely distribution of important
- information.
-
- The ILF encourages you to use this article for educational purposes
- only and to seek out the original article. Minor spelling errors and
- slight alterations of formulas may have gotten past the OCR process.
-
- We apologize for the length, but feel this is one of the key articles
- in this area.
-
-
- J. Cryptology (1988) 1:65-75
-
- The Dining Cryptographers Problem:
-
- Unconditional Sender and Recipient Untraceability
-
- David Chaum
- Centre for Mathematics and Computer Science, Kruislaan 413, 1098 SJ
- Amsterdam, The Netherlands
-
- Abstract. Keeping confidential who sends which messages, in a
- world where any physical transmission can be traced to its
- origin, seems impossible. The solution presented here is
- unconditionally or cryptographically secure, depending on whether
- it is based on one-time-use keys or on public keys, respectively.
- It can be adapted to address efficiently a wide variety of
- practical considerations.
-
- Key words. Untraceability, Unconditional Security, Pseudonymity.
-
- Introduction
-
- Three cryptographers are sitting down to dinner at their favorite
- three-star restaurant. Their waiter informs them that arrangements
- have been made with the maitre d'hotel for the bill to be paid
- anonymously. One of the cryptographers might be paying for the dinner,
- or it might have been NSA (U.S. National Security Agency). The three
- cryptographers respect each other's right to make an anonymous
- payment, but they wonder if NSA is paying. They resolve their
- uncertainty fairly by carrying out the following protocol:
-
- Each cryptographer flips an unbiased coin behind his menu, between
- him and the cryptographer on his right, so that only the two of them can
- see the outcome. Each cryptographer then states aloud whether the two
- coins he can see--the one he flipped and the one his left-hand neighbor
- flipped--fell on the same side or on different sides. If one of the
- cryptographers is the payer, he states the opposite of what he sees. An
- odd number of differences uttered at the table indicates that a
- cryptographer is paying; an even number indicates that NSA is paying
- (assuming that the dinner was paid for only once). Yet if a
- cryptographer is paying, neither of the other two learns anything from
- the utterances about which cryptographer it is.
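-
- (For concreteness, here is a minimal Python sketch of the round just
- described; the simulation and its names are illustrative, not part of
- the paper.)
-
-     # One round of the three-cryptographer protocol.
-     import secrets
-
-     def dining_round(payer=None):
-         # coins[i] is flipped by cryptographer i and seen only by i
-         # and his right-hand neighbor (i + 1) % 3.
-         coins = [secrets.randbelow(2) for _ in range(3)]
-         utterances = []
-         for i in range(3):
-             # Each compares his own coin with his left-hand
-             # neighbor's coin: 1 means "different," 0 means "same."
-             differ = coins[i] ^ coins[(i - 1) % 3]
-             if i == payer:
-                 differ ^= 1   # the payer states the opposite
-             utterances.append(differ)
-         return utterances
-
-     # An odd number of "different" utterances means a cryptographer
-     # paid; an even number means NSA paid.
-     for payer in (None, 0, 1, 2):
-         parity = sum(dining_round(payer)) % 2
-         assert parity == (0 if payer is None else 1)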
-
- To see why the protocol is unconditionally secure if carried out
- faithfully, consider the dilemma of a cryptographer who is not the
- payer and wishes to find out which cryptographer is. (If NSA pays, there
- is no anonymity problem.) There are two cases. In case (1) the two
- coins he sees are the same, one of the other cryptographers said
- "different," and the other one said "same." If the hidden outcome was
- the same as the two outcomes he sees, the cryptographer who said
- "different" is the payer; if the outcome was different, the one who said
- "same" is the payer. But since the hidden coin is fair, both possibilities
- are equally likely. In case (2) the coins he sees are different; if both
- other cryptographers said "different," then the payer is closest to the
- coin that is the same as the hidden coin; if both said "same," then the
- payer is closest to the coin that differs from the hidden coin. Thus, in
- each subcase, a nonpaying cryptographer learns nothing about which of
- the other two is paying.
-
- The cryptographers become intrigued with the ability to make
- messages public untraceably. They devise a way to do this at the table
- for a statement of arbitrary length: the basic protocol is repeated over
- and over; when one cryptographer wishes to make a message public, he
- merely begins inverting his statements in those rounds corresponding
- to 1's in a binary-coded version of his message. If he notices that his
- message would collide with some other message, he may for example
- wait a number of rounds chosen at random from a suitable distribution
- before trying to transmit again.
-
- 1. Generalizing the Approach
-
- During dinner, the cryptographers also consider how any number of
- participants greater than one can carry out a version of the protocol.
- (With two participants, only nonparticipant listeners are unable to
- distinguish between the two potential senders.) Each participant has a
- secret key bit in common with, say, every other participant. Each
- participant outputs the sum, modulo two, of all the key bits he shares,
- and if he wishes to transmit, he inverts his output. If no participant
- transmits, the modulo two sum of the outputs must be zero, since every
- key bit enters exactly twice; if one participant transmits, the sum
- must be one. (In fact, any even number of transmitting participants
- yields zero, and any odd number yields one.) For j rounds, each
- participant could have a j-bit key in common with every other
- participant, and the ith bit of each such key would be used only in the
- ith round. Detected collision of messages leads to attempted
- retransmission as described above; undetected collision results only
- from an odd number of synchronized identical message segments.
- (Generalization to fields other than GF(2) is possible, but seems to
- offer little practical advantage.)
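-
- (A sketch of one such round over a complete key-sharing graph, again
- purely illustrative:)
-
-     # Each pair of participants shares a secret key bit; each
-     # participant announces the XOR of all the key bits he shares,
-     # inverting the result if he wishes to transmit.
-     import secrets
-
-     def dcnet_round(n, sender=None):
-         keys = {(i, j): secrets.randbelow(2)
-                 for i in range(n) for j in range(i + 1, n)}
-         outputs = []
-         for p in range(n):
-             bit = 0
-             for (i, j), k in keys.items():
-                 if p in (i, j):
-                     bit ^= k          # sum, modulo two, of shared keys
-             if p == sender:
-                 bit ^= 1              # invert to transmit a 1
-             outputs.append(bit)
-         return outputs
-
-     # Every key bit enters the total exactly twice, so the XOR of all
-     # outputs is 0 with no sender and 1 with one sender.
-     assert sum(dcnet_round(5)) % 2 == 0
-     assert sum(dcnet_round(5, sender=2)) % 2 == 1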
-
- Other generalizations are also considered during dinner. The
- underlying assumptions are first made explicit, including modeling
- key-sharing arrangements as graphs. Next, the model is illustrated
- with some simple examples. The potential for cooperations of
- participants to violate the security of others is then looked at. Finally,
- a proof of security based on systems of linear equations is given.
-
- 1.1. Model
-
- Each participant is assumed to have two kinds of secret: (a) the keys
- shared with other participants for each round; and (b) the inversion
- used in each round (i.e., a 1 if the participant inverts in that round and a
- 0 if not). Some or all of a participant's secrets may be given to other
- participants in various forms of collusion, discussion of which is
- postponed until Section 1.3. (For simplicity in exposition, the
- possibility of secrets being stolen is ignored throughout.)
-
- The remaining information about the system may be described as: (a)
- who shares keys with whom; and (b) what each participant outputs
- during each round (the modulo two sum of that participant's keys and
- inversion). This information need not be secret to ensure
- untraceability. If it is publicly known and agreed, it allows various
- extensions discussed in Sections 2.5 and 2.6. The sum of all the
- outputs will, of course, usually become known to all participants.
-
- In the terminology of graphs, each participant corresponds to a
- vertex and each key corresponds to an edge. An edge is incident on the
- vertices corresponding to the pair of participants that shares the
- corresponding key. From here on, the graph and dinner-table
- terminologies will be used interchangeably. Also, without loss of
- generality, it will be assumed that the graph is connected (i.e., that a
- path exists between every pair of vertices), since each connected
- component (i.e., each maximal connected subgraph) could be considered
- a separate untraceable-sender system.
-
- An anonymity set seen by a set of keys is the set of vertices in a
- connected component of the graph formed from the original graph by
- removing the edges concerned. Thus a set of keys sees one anonymity
- set for each connected partition induced by removing the keys. The main
- theorem of Section 1.4 is essentially that those having only the public
- information and a set of keys seeing some anonymity set can learn
- nothing about the members of that anonymity set except the overall
- parity of their inversions. Thus, for example, any two participants
- connected by at least one chain of keys unknown to an observer are both
- in the same anonymity set seen by the observer's keys, and the observer
- gains nothing that would help distinguish between their messages.
-
- 1.2. Some Examples
-
- A few simple consequences of the above model may be illustrative. The
- anonymity set seen by the empty set (i.e., by a nonparticipant observer)
- is the set of all vertices, since the graph is assumed connected and
- remains so after zero edges are removed. Also, the anonymity sets seen
- by the full set of edges are all singleton sets, since each vertex's
- inversion is just the sum of its output and the corresponding key bits.
-
- If all other participants cooperate fully against one, of course no
- protocol can keep that singleton's messages untraceable, since
- untraceability exists only among a set of possible actors, and if the set
- has only one member, its messages are traceable. For similar reasons,
- if a participant believes that some subset of other participants will
- fully cooperate against him, there is no need for him to have keys in
- common with them.
-
- A biconnected graph (i.e., a graph with at least two vertex-disjoint
- paths between every pair of vertices) has no cut-vertex (i.e., no single
- vertex whose removal partitions the graph into disjoint subgraphs). In
- such a graph, the set of edges incident on a vertex v sees (apart from v)
- one anonymity set containing all other vertices, since there is a path
- not containing v between every pair of vertices, and thus they form a
- connected subgraph excluding v; each participant acting alone learns
- nothing about the contribution of other participants.
-
- 1.3. Collusion of Participants
-
- Some participants may cooperate by pooling their keys in efforts to
- trace the messages of others; such cooperation will be called collusion.
- For simplicity, the possibilities for multiple collusions or for pooling
- of information other than full edges will be ignored. Colluders who lie
- to each other are only touched on briefly, in Section 2.6.
-
- Consider collusion in a complete graph. A vertex is only seen as a
- singleton anonymity set by the collection of all edges incident on it; all
- other participants must supply the key they share with a participant in
- order to determine that participant's inversions. But since a collusion
- of all but one participant can always trace that participant merely by
- pooling its members' inversions as already mentioned, it gains nothing
- more by pooling its keys. The nonsingleton anonymity set seen by all
- edges incident on a colluding set of vertices in a complete graph is the
- set of all other vertices; again, a collusion yields nothing more from
- pooling all its keys than from pooling all its inversions.
-
- Now consider noncomplete graphs. A full collusion is a subset of
- participants pooling all of their keys. The pooled keys see each colluder
- as a singleton anonymity set; the colluders completely sacrifice the
- untraceability of their own messages. If a full collusion includes a cut-
- set of vertices (i.e., one whose removal partitions the graph), the
- collusion becomes nontrivial because it can learn something about the
- origin of messages originating outside the collusion; the noncolluding
- vertices are partitioned into disjoint subgraphs, which are the
- anonymity sets seen by the pooled keys.
-
- Members of a partial collusion pool some but not all of their keys.
- Unlike the members of a full collusion, each member of a partial
- collusion in general has a different set of keys. For it to be nontrivial,
- a partial collusion's pooled keys must include the bridges or separating
- edges of a segregation or splitting of the graph (i.e., those edges whose
- removal would partition the graph). Settings are easily constructed in
- which the pooled keys see anonymity sets that partition the graph and
- yet leave each colluder in a nonsingleton partition seen by any other
- participant. Thus, colluders can join a collusion without having to make
- themselves completely traceable to the collusion's other members.
-
- 1.4. Proof of Security
-
- Consider, without loss of generality, a single round in which, say, some
- full collusion knows some set of keys. Remove the edges known to the
- collusion from the key-sharing graph and consider any particular
- connected component C of the remaining graph. The vertices of C thus
- form an anonymity set seen by the pooled keys.
-
- Informally, what remains to be shown is that the only thing the
- collusion learns about the members of C is the parity sum of their
- inversions. This is intuitively apparent, since the inversions of the
- members of C are each in effect hidden from the collusion by one or
- more unknown key bits, and only the parity of the sum of these key bits
- is known (to be zero). Thus the inversions are hidden by a one-time pad,
- and only their parity is revealed, because only the parity of the pad is
- known.
-
- The setting is formalized as follows: the connected component C is
- comprised of m vertices and n edges. The incidence matrix M of C is
- defined as usual, with the vertices labeling the rows and the edges
- labeling the columns. Let K, I, and A be stochastic variables defined on
- GF(2)^n, GF(2)^m, and GF(2)^m, respectively, such that
- K is uniformly distributed over GF(2)^n, K and I are mutually
- independent, and A = (MK) ⊕ I. In terms of the protocol, K comprises
- the keys corresponding to the edges, I consists of the inversions
- corresponding to the vertices, and A is formed by the outputs of the
- vertices. Notice that the parity of A (i.e., the modulo two sum of its
- components) is always equal to the parity of I, since the columns of M
- each have zero parity. The desired result is essentially that A reveals
- no more information about I than the parity of I. More formally:
-
- Theorem. Let a be in GF(2)^m. For each i in GF(2)^m, which is assumed by
- I with nonzero probability and which has the same parity as a, the
- conditional probability that A = a given that I = i is 2^(1 - m). Hence,
- the conditional probability that I = i given that A = a is the a priori
- probability that I = i.
-
- Proof. Let i be an element of GF(2)^m having the same parity as a.
- Consider the system of linear equations (Mk) ⊕ i = a, in k an
- element of GF(2)^n. Since the columns of M each have even parity, as
- mentioned above, its rows are linearly dependent over GF(2). But as a
- consequence of the connectedness of the graph, every proper subset of
- rows of M is linearly independent. Thus, the rank of M is m - 1, and so
- each vector with zero parity can be written as a linear combination of
- the columns of M. This implies that the system is solvable, because
- a ⊕ i has even parity. Since the set of n column vectors of M has
- rank m - 1, the system has exactly 2^(n - m + 1) solutions.
-
- Together with the fact that K and I are mutually independent and
- that K is uniformly distributed, the theorem follows easily.
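-
- (The theorem is easy to check numerically on a small component. The
- sketch below takes C to be a triangle, with m = 3 vertices and n = 3
- edges, enumerates all 2^n key vectors for a fixed inversion vector i,
- and confirms that each output vector a of matching parity occurs with
- probability 2^(1 - m) = 1/4.)
-
-     from itertools import product
-     from collections import Counter
-
-     edges = [(0, 1), (1, 2), (0, 2)]   # the triangle component C
-     m, n = 3, len(edges)
-     i_vec = (1, 0, 0)                  # an arbitrary inversion vector
-
-     counts = Counter()
-     for k in product((0, 1), repeat=n):    # all 2^n key vectors
-         a = list(i_vec)
-         for (u, v), bit in zip(edges, k):  # MK: each key bit enters
-             a[u] ^= bit                    # the outputs of both of
-             a[v] ^= bit                    # its endpoints
-         counts[tuple(a)] += 1
-
-     for a, c in counts.items():
-         assert sum(a) % 2 == sum(i_vec) % 2  # parity of A = parity of I
-         assert c / 2**n == 2**(1 - m)        # all such a equally likely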
-
- 2. Some Practical Considerations
-
- After dinner, while discussing how they can continue to make
- untraceable statements from their respective homes, the cryptographers
- take up a variety of other topics. In particular, they consider different
- ways to establish the needed keys; debate adapting the approach to
- various kinds of communication networks; examine the traditional
- problems of secrecy and authentication in the context of a system that
- can provide essentially optimal untraceability; address denial of
- service caused by malicious and devious participants; and propose
- means to discourage socially undesirable messages from being sent.
-
- 2.1. Establishing Keys
-
- One way to provide the keys needed for longer messages is for one
- member of each pair to toss many coins in advance. Two identical
- copies of the resulting bits are made, say each on a separate optical
- disk. Supplying one such disk (which today can hold on the order of
- 10^10 bits) to a partner provides enough key bits to allow people to
- type messages at full speed for years. If participants are not
- transmitting all the time, the keys can be made to last even longer by
- using a substantially slower rate when no message is being sent; the
- full rate would be invoked automatically only when a 1 bit indicated
- the beginning of a message. (This can also reduce the bandwidth
- requirements discussed in Section 2.2.)
-
- Another possibility is for a pair to establish a short key and use a
- cryptographic pseudorandom-sequence generator to expand it as needed.
- Of course this system might be broken if the generator were broken.
- Cryptanalysis may be made more difficult, however, by lack of access
- to the output of individual generators. Even when the cryptographers do
- not exchange keys at dinner, they can safely do so later using a public-
- key distribution system (first proposed by [4] and [3]).
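-
- (A sketch of the second possibility, using SHA-256 in counter mode as
- a stand-in pseudorandom generator; the text does not prescribe any
- particular generator.)
-
-     import hashlib
-
-     def expand_key(shared_secret, num_bytes):
-         # Expand a short shared secret into a long stream of key bits.
-         out = b""
-         counter = 0
-         while len(out) < num_bytes:
-             out += hashlib.sha256(
-                 shared_secret + counter.to_bytes(8, "big")).digest()
-             counter += 1
-         return out[:num_bytes]
-
-     # Both members of a pair run the same expansion on the same short
-     # secret and obtain identical long key streams.
-     assert expand_key(b"short secret", 100) == expand_key(b"short secret", 100)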
-
- 2.2 Underlying Communication Techniques
-
- A variety of underlying communication networks can be used, and their
- topology need not be related to that of the key-sharing graph.
-
- Communication systems based on simple cycles, called rings, are
- common in local area networks. In a typical ring, each node receives
- each bit and passes it round-robin to the next node. This technology is
- readily adapted to the present protocols. Consider a single-bit message
- like the "I paid" message originally sent at the dinner table. Each
- participant exclusive-or's the bit he receives with his own output
- before forwarding it to the next participant. When the bit has traveled
- full circle, it is the exclusive-or sum of all the participants' outputs,
- which is the desired result of the protocol. To provide these messages
- to all participants, each bit is sent around a second time by the
- participant at the end of the loop.
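-
- (A sketch of the adapted ring: each node XORs its own output into the
- circulating bit, so one full circle yields the XOR sum of all outputs,
- the result of the round.)
-
-     def ring_pass(outputs):
-         bit = 0
-         for node_output in outputs:   # round-robin around the ring
-             bit ^= node_output
-         return bit                    # sent around a second time so
-                                       # every participant learns it
-
-     assert ring_pass([1, 0, 1, 1]) == 1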
-
- Such an adapted ring requires, on average, a fourfold increase in
- bandwidth over the obvious traceable protocols in which messages
- travel only halfway around on average before being taken off the ring by
- their recipients. Rings differ from the dinner table in that several bit-
- transmission delays may be required before all the outputs of a
- particular round are known to all participants; collisions are detected
- only after such delays.
-
- Efficient use of many other practical communication techniques
- requires participants to group output bits into blocks. For example, in
- high-capacity broadcast systems, such as those based on coaxial cable,
- surface radio, or satellites, more efficient use of channel capacity is
- obtained by grouping a participant's contribution into a block about the
- size of a single message (see, e.g., [5]). Use of such communication
- techniques could require an increase in bandwidth on the order of the
- number of participants.
-
- In a network with one message per block, the well-known contention
- protocols can be used: time is divided evenly into frames; a participant
- transmits a block during one frame; if the block was garbled by
- collision (presumably with another transmitted block), the participant
- waits a number of frames chosen at random from some distribution
- before attempting to retransmit; the participants' waiting intervals
- may be adjusted on the basis of the collision rate and possibly of other
- heuristics [5].
-
- In a network with many messages per block, a first block may be
- used by various anonymous senders to request a "slot reservation" in a
- second block. A simple scheme would be for each anonymous sender to
- invert one randomly selected bit in the first block for each slot they
- wish to reserve in the second block. After the result of the first block
- becomes known, the participant who caused the ith 1 bit in the first
- block sends in the ith slot of the second block.
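-
- (An illustrative sketch of the reservation scheme; the slot assignment
- at the end assumes no two senders happened to pick the same bit.)
-
-     import secrets
-
-     def reserve(block_size, num_senders):
-         block = [0] * block_size
-         positions = {}
-         for sender in range(num_senders):
-             pos = secrets.randbelow(block_size)
-             block[pos] ^= 1        # inversions at the same bit cancel
-             positions[sender] = pos
-         return block, positions
-
-     block, positions = reserve(64, 3)
-     ones = [i for i, b in enumerate(block) if b == 1]
-     # Sender s transmits in slot ones.index(positions[s]) of the
-     # second block (absent collisions in the reserving block).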
-
- 2.3. Example Key-Sharing Graphs
-
- In large systems it may be desirable to use fewer than the m(m - 1)/2
- keys required by a complete graph. If the graph is merely a cycle, then
- individuals acting alone learn nothing, but any two colluders can
- partition the graph, perhaps fully compromising a participant
- immediately between them. Such a topology might nevertheless be
- adequate in an application in which nearby participants are not likely
- to collude against one another.
-
- A different topology assumes the existence of a subset of
- participants whom each participant believes to be sufficiently unlikely to
- collude, such as participants with conflicting interests. This subset
- constitutes a fully connected subgraph, and the other participants each
- share a key with every member of it. Every participant is then
- untraceable among all the others, unless all members of the completely
- connected subset cooperate. (Such a situation is mentioned again in
- Section 3.)
-
- If many people wish to participate in an untraceable communication
- system, hierarchical arrangements may offer further economy of keys.
- Consider an example in which a representative from each local fully
- connected subgraph is also a member of the fully connected central
- subgraph. The nonrepresentative members of a local subgraph provide
- the sum of their outputs to their representative. Representatives would
- then add their own contributions before providing the sum to the
- central subgraph. Only a local subgraph's representative, or a collusion
- of representatives from all other local subgraphs, can recognize
- messages as coming from the local subgraph. A collusion comprising
- the representative and all but one nonrepresentative member of a local
- subgraph is needed for messages to be recognized as coming from the
- remaining member.
-
- 2.4. Secrecy and Authentication
-
- What about the usual cryptologic problems of secrecy and
- authentication?
-
- A cryptographer can ensure the secrecy of an anonymous message by
- encrypting the message with the intended recipient's public key. (The
- message should include a hundred or so random bits to foil attempts to
- confirm a guess at its content [1].) The sender can even keep the
- identity of the intended recipient secret by leaving it to each recipient
- to try to decrypt every message. Alternatively, a prearranged prefix
- could be attached to each message so that the recipient need only
- decrypt messages with recognized prefixes. To keep even the
- multiplicity of a prefix's use from being revealed, a different prefix
- might be used each time. New prefixes could be agreed in advance,
- generated cryptographically as needed, or supplied in earlier messages.
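-
- (A sketch of cryptographically generated prefixes, one of the options
- just mentioned; the construction and sizes are illustrative.)
-
-     import hashlib
-
-     def prefix(shared_secret, message_number):
-         # Sender and recipient derive the same per-message prefix,
-         # so no prefix is ever reused.
-         digest = hashlib.sha256(
-             shared_secret + message_number.to_bytes(8, "big"))
-         return digest.digest()[:8]
-
-     # The recipient precomputes the next expected prefixes and fully
-     # decrypts only messages that bear one of them.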
-
- Authentication is also quite useful in systems without identification.
- Even though the messages are untraceable, they might still bear
- digital signatures corresponding to public-key "digital pseudonyms"
- [1]; only the untraceable owner of such a pseudonym would be able to
- sign subsequent messages with it. Secure payment protocols have
- elsewhere been proposed in which the payer and/or the payee might be
- untraceable [2]. Other protocols have been proposed that allow
- individuals known only by pseudonyms to transfer securely information
- about themselves between organizations [2]. All these systems require
- solutions to the sender untraceability problem, such as the solution
- presented here, if they are to protect the unlinkability of pseudonyms
- used to conduct transactions from home.
-
- 2.5. Disruption
-
- Another question is how to stop participants who, accidentally or even
- intentionally, disrupt the system by preventing others from sending
- messages. In a sense, this problem has no solution, since any
- participant can send messages continuously, thereby clogging the
- channel. But nondisrupters can ultimately stop disruption in a system
- meeting the following requirements: (1) the key-sharing graph is
- publicly agreed on; (2) each participant's outputs are publicly agreed on
- in such a way that participants cannot change their output for a round
- on the basis of other participants' outputs for that round; and (3) some
- rounds contain inversions that would not compromise the
- untraceability of any nondisrupter.
-
- The first requirement has already been mentioned in Section 1.1,
- where it was said that this information need not be secret; now it is
- required that this information actually be made known to all
- participants and that the participants agree on it.
-
- The second requirement is in part that disrupters be unable (at least
- with some significant probability) to change their output after hearing
- other participants' outputs. Some actual channels would automatically
- ensure this, such as broadcast systems in which all broadcasts are
- made simultaneously on different frequencies. The remainder of the
- second requirement, that the outputs be publicly agreed on, might also
- be met by broadcasting. With only channels that do not provide it
- automatically, an effective way to meet the full second requirement
- would be for participants to "commit" to their outputs before making
- them. One way to do this is for participants to make public and agree on
- some (possibly compressing and hierarchical, see Section 2.6) one-way
- function of their outputs, before the outputs are made public.
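-
- (A sketch of such a commitment, using a hashed random nonce as the
- one-way function; this is one common construction, not the only one.)
-
-     import hashlib, secrets
-
-     def commit(output_bit):
-         nonce = secrets.token_bytes(16)
-         c = hashlib.sha256(bytes([output_bit]) + nonce).digest()
-         return c, nonce     # publish c now; reveal bit and nonce later
-
-     def check(c, output_bit, nonce):
-         return hashlib.sha256(bytes([output_bit]) + nonce).digest() == c
-
-     c, nonce = commit(1)
-     assert check(c, 1, nonce) and not check(c, 0, nonce)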
-
- The third requirement is that at least some rounds can be contested
- (i.e., that all inversions can be made public) without compromising the
- untraceability of non-disrupting senders. The feasibility of this will be
- demonstrated here by a simple example protocol based on the slot
- reservation technique already described in Section 2.2.
-
- Suppose that each participant is always to make a single reservation
- in each reserving block, whether or not he actually intends to send a
- message. (Notice that, because of the "birthday paradox," the number of
- bits per reserving block must be quadratic in the number of
- participants.) A disrupted reserving block would then with very high
- probability have Hamming weight unequal to the number of participants.
- All bits of such a disrupted reserving block could be contested without
- loss of untraceability for nondisrupters.
-
- The reserved blocks can also be made to have such safely contestable
- bits if participants send trap messages. To lay a trap, a participant
- first chooses the index of a bit in some reserving block, a random
- message, and a secret key. Then the trapper makes public an
- encryption, using the secret key, of both the bit index and the random
- message. Later, the trapper reserves by inverting in the round
- corresponding to the bit index, and sends the random message in the
- resulting reserved slot. If a disrupter is unlucky enough to have
- damaged a trap message, then release of the secret key by the trapper
- would cause at least one bit of the reserved slot to be contested.
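-
- (A sketch of laying a trap. A hash commitment stands in here for the
- secret-key encryption described above; the point is only that the bit
- index and random message are fixed in advance yet revealed only if a
- disrupter damages the trap.)
-
-     import hashlib, secrets
-
-     bit_index = secrets.randbelow(64)       # reservation bit to invert
-     trap_message = secrets.token_bytes(32)  # random message to send
-     trap_key = secrets.token_bytes(16)      # withheld unless disrupted
-
-     # Published before the reserving block:
-     trap_commitment = hashlib.sha256(
-         trap_key + bit_index.to_bytes(2, "big") + trap_message).digest()
-
-     # If the reserved slot is damaged, releasing trap_key, bit_index,
-     # and trap_message lets everyone check them against the published
-     # commitment and contest the slot without loss of untraceability.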
-
- With the three requirements satisfied, it remains to be shown that,
- if enough disrupted rounds are contested, the disrupters will be
- excluded from the network.
-
- Consider first the case of a single participant's mail computer
- disrupting the network. If it tells the truth about contested key bits it
- shares (or lies about an even number of bits), the disrupter implicates
- itself, because its contribution to the sum is unequal to the sum of
- these bits (apart from any allowed inversion). If, on the other hand, the
- single disrupter lies about some odd number of shared bits, the values
- it claims will differ from those claimed for the same shared bits by
- the other participants sharing them. The disrupter thereby casts
- suspicion on all participants, including itself, that share the disputed
- bits. (It may be difficult for a disrupter to cast substantial suspicion
- on a large set of participants, since all the disputed bits will be in
- common with the disrupter.) Notice, however, that participants who
- have been falsely accused will know that they have been--and by
- whom--and should at least refuse to share bits with the disrupter in
- the future.
-
- Even with colluding multiple disrupters, at least one inversion must
- be revealed as illegitimate or at least one key bit disputed, since the
- parity of the outputs does not correspond to the number of legitimate
- inversions. The result of such a contested round will be the removal of
- at least one edge or at least one vertex from the agreed graph. Thus, if
- every disruptive action has a nonzero probability of being contested,
- only a bounded amount of disruption is possible before the disrupters
- share no keys with anyone in the network, or before they are revealed,
- and are in either case excluded from the network.
-
- The extension presented next can demonstrate the true value of
- disputed bits, and hence allows direct incrimination of disrupters.
-
- 2.6. Tracing by Consent
-
- Antisocial use of a network can be deterred if the cooperation of most
- participants makes it possible, albeit expensive, to trace any message.
- If, for example, a threatening message is sent, a court might order all
- participants to reveal their shared key bits for a round of the message.
- The sender of the offending message might try to spread the blame,
- however, by lying about some odd number of shared bits. Digital
- signatures can be used to stop such blame-spreading altogether. In
- principle, each party sharing a key could insist on a signature, made by
- the other party sharing, for the value of each shared bit.
-
- Such signatures would allow for contested rounds to be fully resolved,
- for accused senders to exonerate themselves, and even for colluders to
- convince each other that they are pooling true keys. Unfortunately,
- cooperating participants able to trace a message to its sender could
- convince others of the message's origin by revealing the sender's own
- signatures. A variation can prevent a participant's signatures from
- being used against him in this way: instead of each member of a pair
- of participants signing the same shared key bit, each signs a separate
- bit, such that the sum of the signed bits is the actual shared key
- bit. Signatures on such "split" key bits would still be useful in
- resolving contested rounds, since if one contester of a bit shows a
- signature made by the second contester, then the second would have to
- reveal the corresponding signature made by the first or be thought to
- be a disrupter.
-
- In many applications it may be impractical to obtain a separate
- signature on every key bit or split key bit. The overhead involved could
- be greatly reduced, however, by digitally signing cryptographic
- compressions of large numbers of key bits. This might of course require
- that a whole block of key bits be exposed in showing a signature, but
- such blocks could be padded with cryptographically generated
- pseudorandom (or truly random) bits, to allow the exposure of fewer
- bits per signature. The number of bits and amount of time required to
- verify a signature for a single bit can be reduced further by using a
- rooted tree in which each node is the one-way compression function of
- all its direct descendants; only a digital signature of each participant's
- root need be agreed on before use of the keys comprising the leaves.
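-
- (A sketch of such a rooted tree, in the style of a Merkle hash tree;
- the block contents are illustrative.)
-
-     import hashlib
-
-     def tree_root(leaves):
-         # Each node is the one-way compression of its two children.
-         level = [hashlib.sha256(leaf).digest() for leaf in leaves]
-         while len(level) > 1:
-             if len(level) % 2:
-                 level.append(level[-1])   # duplicate an odd last node
-             level = [hashlib.sha256(level[i] + level[i + 1]).digest()
-                      for i in range(0, len(level), 2)]
-         return level[0]
-
-     # Only a signature on this root need be agreed on before the key
-     # blocks (the leaves) are used.
-     key_blocks = [b"key block %d" % i for i in range(8)]
-     root = tree_root(key_blocks)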
-
- 3. Relation to Previous Work
-
- There is another multiparty-secure sender-untraceability protocol in
- the literature [1]. To facilitate comparison, it will be called a mix-net
- here, while the protocol of the present work is called a dc-net. The
- mix-net approach relies on the security of a true public-key system
- (and possibly also of a conventional cryptosystem), and is thus at best
- computationally secure; the dc-net approach can use unconditional
- secrecy channels to provide an unconditionally secure untraceable-
- sender system, or can use public-key distribution to provide a
- computationally secure system (as described in Section 2.1).
-
- Under some trust assumptions and channel limitations, however,
- mix-nets can operate where dc-nets cannot. Suppose that a subset of
- participants is trusted by every other participant not to collude and
- that the bandwidth of at least some participants' channels to the
- trusted subset is incapable of handling the total message traffic. Then
- mix-nets may operate quite satisfactorily, but dc-nets will be unable
- to protect fully each participant's untraceability. Mix-nets can also
- provide recipient untraceability in this communication environment,
- even though there is insufficient bandwidth for use of the broadcast
- approach (mentioned in Section 2.4).
-
- If optimal protection against collusion is to be provided and the
- crypto-security of mix-nets is acceptable, a choice between mix-nets
- and dc-nets may depend on the nature of the traffic. With a mail-like
- system that requires only periodic deliveries, and where the average
- number of messages per interval is relatively large, mix-nets may be
- suitable. When messages must be delivered continually and there is no
- time for batching large numbers of them, dc-nets appear preferable.
-
- 4. Conclusion
-
- This solution to the dining cryptographers problem demonstrates that
- unconditional secrecy channels can be used to construct an
- unconditional sender-untraceability channel. It also shows that a
- public-key distribution system can be used to construct a
- computationally secure sender-untraceability channel. The approach
- appears able to satisfy a wide range of practical concerns.
-
- Acknowledgments
-
- I am pleased to thank Jurjen Bos, Gilles Brassard, Jan-Hendrik Evertse,
- and the untraceable referees for all their help in revising this article.
- It is also a pleasure to thank, as in the original version that was
- distributed at Crypto 84, Whitfield Diffie, Ron Rivest, and Gus Simmons
- for some stimulating dinner-table conversations.
-
- References
-
- [1] Chaum, D., Untraceable Electronic Mail, Return Addresses, and
- Digital Pseudonyms, Communications of the ACM, vol. 24, no. 2,
- February 1981, pp. 84-88.
- [2] Chaum, D., Security Without Identification: Transaction Systems
- to Make Big Brother Obsolete, Communications of the ACM, vol. 28,
- no. 10, October 1985, pp. 1030-1044.
- [3] Diffie, W., and Hellman, M.E., New Directions in Cryptography, IEEE
- Transactions on Information Theory, vol. 22, no. 6, November 1976,
- pp. 644-654.
- [4] Merkle, R.C., Secure Communication over Insecure Channels,
- Communications of the ACM, vol. 21, no. 4, 1978, pp. 294-299.
- [5] Tanenbaum, A.S., Computer Networks, Prentice Hall, Englewood
- Cliffs, New Jersey, 1981.
-
-
- [End of Transmission]
-
-
- Date: 03 Dec 1992 08:29:12 -0800 (PST)
- From: tcmay@netcom.com (Timothy C. May)
- Subject: RE: faq & glossary
- In-reply-to: <01GRUPQRO0N68WWDDT@delphi.com>; from "NORDEVALD@delphi.com" at
- Dec 2, 92 8:16 pm
- To: NORDEVALD@delphi.com
- Message-id: <9212031629.AA15785@netcom.netcom.com>
- Content-transfer-encoding: 7BIT
- X-Mailer: ELM [version 2.3 PL11]
-
-
-
- CRYPTO GLOSSARY
-
- Compiled by Tim May (tcmay@netcom.com) and Eric Hughes
- (hughes@soda.berkeley.edu), circa September 1992.
-
- Major Branches of Cryptology (as we see it)
-
- - (these sections will introduce the terms in context,
- though complete definitions will not be given)
-
- *** Encryption
- - privacy of messages
- - using ciphers and codes to protect the secrecy of
- messages
- - DES is the most common symmetric cipher (same key for
- encryption and decryption)
- - RSA is the most common asymmetric cipher (different
- keys for encryption and decryption)
-
- *** Signatures and Authentication
- - proving who you are
- - proving you signed a document (and not someone else)
-
- *** Untraceable Mail
- - untraceable sending and receiving of mail and messages
- - focus: defeating eavesdroppers and traffic analysis
- - DC protocol (dining cryptographers)
-
- *** Cryptographic Voting
- - focus: ballot box anonymity
- - credentials for voting
- - issues of double voting, security, robustness, efficiency
-
- *** Digital Cash
- - focus: privacy in transactions, purchases
- - unlinkable credentials
- - blinded notes
- - "digital coins" may not be possible
-
- *** Crypto Anarchy
- - using the above to evade gov't., to bypass tax collection,
- etc.
- - a technological solution to the problem of too much
- government
-
-
- *** G L O S S A R Y ***
-
-
- *** agoric systems -- open, free market systems in which
- voluntary transactions are central.
-
- *** Alice and Bob -- cryptographic protocols are often made
- clearer by considering parties A and B, or Alice and Bob,
- performing some protocol. Eve the eavesdropper, Paul the
- prover, and Vic the verifier are other common stand-in names.
-
- *** ANDOS -- all or nothing disclosure of secrets.
-
- *** anonymous credential -- a credential which asserts
- some right or privilege or fact without revealing the identity
- of the holder. This is unlike CA driver's licenses.
-
- *** asymmetric cipher -- same as public key
- cryptosystem.
-
- *** authentication -- the process of verifying an identity
- or credential, to ensure you are who you said you were.
-
- *** biometric security -- a type of authentication using
- fingerprints, retinal scans, palm prints, or other
- physical/biological signatures of an individual.
-
- *** bit commitment -- e.g., tossing a coin and then
- committing to the value without being able to change the
- outcome. The blob is a cryptographic primitive for this.
-
- *** blinding, blinded signatures -- A signature that the
- signer does not remember having made. A blind signature is
- always a cooperative protocol and the receiver of the
- signature provides the signer with the blinding information.
-
- *** blob -- the crypto equivalent of a locked box. A
- cryptographic primitive for bit commitment, with the
- properties that a blob can represent a 0 or a 1, that others
- cannot tell by looking whether it's a 0 or a 1, that the creator
- of the blob can "open" the blob to reveal the contents, and that
- no blob can be both a 1 and a 0. An example of this is a flipped
- coin covered by a hand.
-
- *** channel -- the path over which messages are
- transmitted. Channels may be secure or insecure, and may
- have eavesdroppers (or enemies, or disrupters, etc.) who alter
- messages, insert and delete messages, etc. Cryptography is
- the means by which communications over insecure channels
- are protected.
-
- *** chosen plaintext attack -- an attack where the
- cryptanalyst gets to choose the plaintext to be enciphered,
- e.g., when an enciphering machine or algorithm is in the
- possession of the cryptanalyst.
-
- *** cipher -- a secret form of writing, using substitution or
- transposition of characters or symbols.
-
- *** ciphertext -- the plaintext after it has been encrypted.
-
- *** code -- a restricted cryptosystem where words or
- letters of a message are replaced by other words chosen from
- a codebook. Not part of modern cryptology, but still useful.
-
- *** coin flipping -- an important crypto primitive, or
- protocol, in which the equivalent of flipping a fair coin is
- possible. Implemented with blobs.
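-
- A sketch in Python, using a hash commitment as the blob (illustrative):
-
-     import hashlib, secrets
-
-     # Alice commits to a random bit and sends only the blob to Bob.
-     alice_bit = secrets.randbelow(2)
-     alice_nonce = secrets.token_bytes(16)
-     blob = hashlib.sha256(bytes([alice_bit]) + alice_nonce).digest()
-
-     # Bob announces his own bit only after receiving the blob.
-     bob_bit = secrets.randbelow(2)
-
-     # Alice opens the blob; Bob verifies before accepting the result.
-     assert hashlib.sha256(bytes([alice_bit]) + alice_nonce).digest() == blob
-     coin = alice_bit ^ bob_bit    # a fair coin neither party controls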
-
- *** collusion -- wherein several participants cooperate to
- deduce the identity of a sender or receiver, or to break a
- cipher. Most cryptosystems are sensitive to some forms of
- collusion. Much of the work on implementing DC Nets, for
- example, involves ensuring that colluders cannot isolate
- message senders and thereby trace origins and destinations
- of mail.
-
- *** computationally secure -- where a cipher cannot be
- broken with available computer resources, but in theory can
- be broken with enough computer resources. Contrast with
- unconditionally secure.
-
- *** countermeasure -- something you do to thwart an
- attacker.
-
- *** credential -- facts or assertions about some entity. For
- example, credit ratings, passports, reputations, tax status,
- insurance records, etc. Under the current system, these
- credentials are increasingly being cross-linked. Blind
- signatures may be used to create anonymous credentials.
-
- *** credential clearinghouse -- banks, credit agencies,
- insurance companies, police departments, etc., that correlate
- records and decide the status of records.
-
- *** cryptanalysis -- methods for attacking and breaking
- ciphers and related cryptographic systems. Ciphers may be
- broken, traffic may be analyzed, and passwords may be
- cracked. Computers are of course essential.
-
- *** crypto anarchy -- the economic and political system
- after the deployment of encryption, untraceable e-mail,
- digital pseudonyms, cryptographic voting, and digital cash. A
- pun on "crypto," meaning "hidden," and as when Gore Vidal
- called William F. Buckley a "crypto fascist."
-
- *** cryptography -- another name for cryptology.
-
- *** cryptology -- the science and study of writing, sending,
- receiving, and deciphering secret messages. Includes
- authentication, digital signatures, the hiding of messages
- (steganography), cryptanalysis, and several other fields.
-
- *** cyberspace -- the electronic domain, the Nets, and
- computer-generated spaces. Some say it is the "consensual
- reality" described in "Neuromancer." Others say it is the phone
- system. Others have work to do.
-
- *** DC protocol, or DC-Net -- the dining cryptographers
- protocol. DC-Nets use multiple participants communicating
- with the DC protocol.
-
- *** DES -- the Data Encryption Standard, proposed in
- 1977 by the National Bureau of Standards (now NIST), with
- assistance from the National Security Agency. Based on the
- "Lucifer" cipher developed by Horst Feistel at IBM, DES is a
- secret key cryptosystem that cycles 64-bit blocks of data
- through multiple permutations with a 56-bit key controlling
- the routing. "Diffusion" and "confusion" are combined to form
- a cipher that has not yet been cryptanalyzed (see "DES,
- Security of"). DES is in use for interbank transfers, as a
- cipher inside of several RSA-based systems, and is available
- for PCs.
-
- *** DES, Security of -- many have speculated that the NSA
- placed a trapdoor (or back door) in DES to allow it to read
- DES-encrypted messages. This has not been proved. It is
- known that the original Lucifer algorithm used a 128-bit key
- and that this key length was shortened to 64 bits (56 bits
- plus 8 parity bits), thus making exhaustive search much
- easier (so far as is known, brute-force search has not been
- done, though it should be feasible today). Shamir and Biham
- have used a technique called "differential cryptanalysis" to
- reduce the exhaustive search needed for chosen plaintext
- attacks (but with no import for ordinary DES).
-
- *** differential cryptanalysis -- the Shamir-Biham
- technique for cryptanalyzing DES. With a chosen plaintext
- attack, they've reduced the number of DES keys that must be
- tried from about 2^56 to about 2^47 or less. Note, however,
- that rarely can an attacker mount a chosen plaintext attack
- on DES systems.
-
- *** digital cash, digital money -- Protocols for
- transferring value, monetary or otherwise, electronically.
- Digital cash usually refers to systems that are anonymous.
- Digital money systems can be used to implement any quantity
- that is conserved, such as points, mass, dollars, etc. There
- are many variations of digital money systems, ranging from
- VISA numbers to blinded signed digital coins. A topic too
- large for a single glossary entry.
-
- *** digital pseudonym -- basically, a "crypto identity." A
- way for individuals to set up accounts with various
- organizations without revealing more information than they
- wish. Users may have several digital pseudonyms, some used
- only once, some used over the course of many years. Ideally,
- the pseudonyms can be linked only at the will of the holder. In
- the simplest form, a public key can serve as a digital
- pseudonym and need not be linked to a physical identity.
-
- *** digital signature -- Analogous to a written signature
- on a document. A modification to a message that only the
- signer can make but that everyone can recognize. Can be used
- legally to contract at a distance.
-
- *** digital timestamping -- one function of a digital
- notary public, in which some message (a song, screenplay, lab
- notebook, contract, etc.) is stamped with a time that cannot
- (easily) be forged.
-
- *** dining cryptographers protocol (aka DC protocol,
- DC nets) -- the untraceable message sending system
- invented by David Chaum. Named after the "dining
- philosophers" problem in computer science. Participants form
- circuits and pass messages in such a way that the origin
- cannot be deduced, barring collusion. At the simplest level,
- two participants share a key between them. One of them
- sends some actual message by bitwise exclusive-ORing the
- message with the key, while the other one just sends the key
- itself. The actual message from this pair of participants is
- obtained by XORing the two outputs. However, since nobody
- but the pair knows the original key, the actual message
- cannot be traced to either one of the participants.
-
- *** discrete logarithm problem -- given integers a, n,
- and x, find some integer m such that a^m mod n = x, if m
- exists. Modular exponentiation, the a^m mod n part, is
- straightforward (and special purpose chips are available), but
- the inverse problem is believed to be very hard, in general.
- Thus it is conjectured that modular exponentiation is a one-
- way function.
-
- *** DSS, Digital Signature Standard -- the latest NIST
- (National Institute of Standards and Technology, successor to
- NBS) standard for digital signatures. Based on the El Gamal
- cipher, some consider it weak and a poor substitute for RSA-
- based signature schemes.
-
- *** eavesdropping, or passive wiretapping --
- intercepting messages without detection. Radio waves may be
- intercepted, phone lines may be tapped, and computers may
- have RF emissions detected. Even fiber optic lines can be
- tapped.
-
- *** factoring -- Some large numbers are difficult to factor.
- It is conjectured that there are no feasible--i.e., "easy," less
- than exponential in the size of the number--factoring methods. It is
- also an open problem whether RSA may be broken more easily
- than by factoring the modulus (e.g., the public key might
- reveal information which simplifies the problem).
- Interestingly, though factoring is believed to be "hard", it is
- not known to be in the class of NP-hard problems. Professor
- Janek invented a factoring device, but he is believed to be
- fictional.
-
- *** information-theoretic security -- "unbreakable"
- security, in which no amount of cryptanalysis can break a
- cipher or system. One-time pads are an example (provided the
- pads are not lost, stolen, or used more than once, of
- course). Same as unconditionally secure.
-
- *** key -- a piece of information needed to encipher or
- decipher a message. Keys may be stolen, bought, lost, etc.,
- just as with physical keys.
-
- *** key exchange, or key distribution -- the process of
- sharing a key with some other party, in the case of symmetric
- ciphers, or of distributing a public key in an asymmetric
- cipher. A major issue is that the keys be exchanged reliably
- and without compromise. Diffie and Hellman devised one such
- scheme, based on the discrete logarithm problem.
-
- *** known-plaintext attack -- a cryptanalysis of a cipher
- where plaintext-ciphertext pairs are known. This attack
- searches for an unknown key. Contrast with the chosen
- plaintext attack, where the cryptanalyst can also choose the
- plaintext to be enciphered.
-
- *** mail, untraceable -- a system for sending and
- receiving mail without traceability or observability.
- Receiving mail anonymously can be done with broadcast of the
- mail in encrypted form. Only the intended recipient (whose
- identity, or true name, may be unknown to the sender) may
- be able to decipher the message. Sending mail anonymously
- apparently requires mixes or use of the dining cryptographers
- (DC) protocol.
-
- *** minimum disclosure proofs -- another name for zero
- knowledge proofs, favored by Chaum.
-
- *** mixes -- David Chaum's term for a box which performs
- the function of mixing, or decorrelating, incoming and
- outgoing electronic mail messages. The box also strips off
- the outer envelope (i.e., decrypts with its private key) and
- remails the message to the address on the inner envelope.
- Tamper-resistant modules may be used to prevent cheating
- and forced disclosure of the mapping between incoming and
- outgoing mail. A sequence of many remailings effectively
- makes tracing sending and receiving impossible. Contrast this
- with the software version, the DC protocol.
-
- *** modular exponentiation -- raising an integer to the
- power of another integer, modulo some integer. For integers
- a, n, and m, a^m mod n. For example, 5^3 mod 100 = 25. Modular
- exponentiation can be done fairly quickly with a sequence of
- bit shifts and adds, and special purpose chips have been
- designed. See also discrete logarithm.
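-
- In Python, for instance, the built-in pow(a, m, n) computes a^m mod n
- by repeated squaring:
-
-     assert pow(5, 3, 100) == 25   # the example above
-     # The inverse problem -- given a, n, and x, find m with
-     # a^m mod n == x -- is the discrete logarithm problem,
-     # conjectured to be hard.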
-
- *** National Security Agency (NSA) -- the largest
- intelligence agency, responsible for making and breaking
- ciphers, for intercepting communications, and for ensuring
- the security of U.S. computers. Headquartered in Fort Meade,
- Maryland, with many listening posts around the world. The
- NSA funds cryptographic research and advises other agencies
- about cryptographic matters. The NSA once obviously had the
- world's leading cryptologists, but this may no longer be the
- case.
-
- *** negative credential -- a credential that you possess
- that you don't want anyone else to know, for example, a
- bankruptcy filing. A formal version of a negative reputation.
-
- *** NP-complete -- a large class of difficult problems.
- "NP" stands for nondeterministic polynomial time, a class of
- problems thought in general not to have feasible algorithms
- for their solution. A problem is "complete" if any other NP
- problem may be reduced to that problem. Many important
- combinatorial and algebraic problems are NP-complete: the
- traveling salesman problem, the Hamiltonian cycle problem,
- the word problem, and on and on.
-
- *** oblivious transfer -- a cryptographic primitive that
- involves the probabilistic transmission of bits. The sender
- does not know if the bits were received.
-
- *** one-time pad -- a string of randomly-selected bits or
- symbols which is combined with a plaintext message to
- produce the ciphertext. The combination may be done by shifting
- letters some amount, by bitwise exclusive-ORing, etc. The
- recipient, who also has a copy of the one time pad, can easily
- recover the plaintext. Provided the pad is only used once and
- then destroyed, and is not available to an eavesdropper, the
- system is perfectly secure, i.e., it is information-
- theoretically secure. Key distribution (the pad) is obviously a
- practical concern, but consider CD-ROM's.
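-
- A sketch of a one-time pad over bytes (illustrative):
-
-     import secrets
-
-     plaintext = b"attack at dawn"
-     pad = secrets.token_bytes(len(plaintext))  # used once, destroyed
-     ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
-     # XORing with the same pad recovers the plaintext.
-     assert bytes(c ^ k for c, k in zip(ciphertext, pad)) == plaintext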
-
- *** one-way function -- a function which is easy to
- compute in one direction but hard to find any inverse for, e.g.
- modular exponentiation, where the inverse problem is known
- as the discrete logarithm problem. Compare the special case
- of trap door one-way functions. An example of a one-way
- operation is multiplication: it is easy to multiply two
- prime numbers of 100 digits to produce a 200-digit number,
- but hard to factor that 200-digit number.
-
- *** P ?=? NP -- Certainly the most important unsolved
- problem in complexity theory. If P = NP, then all NP problems
- are "easy," and cryptography as we know it today does not
- exist.
-
- *** padding -- sending extra messages to confuse
- eavesdroppers and to defeat traffic analysis. Also adding
- random bits to a message to be enciphered.
-
- *** plaintext -- also called cleartext, the text that is to be
- enciphered.
-
- *** Pretty Good Privacy (PGP) -- Philip Zimmermann's
- implementation of RSA, recently upgraded to version 2.0,
- with more robust components and several new features. RSA
- Data Security has threatened PZ so he no longer works on it.
- Version 2.0 was written by a consortium of non-U.S. hackers.
-
- *** prime numbers -- integers with no factors other than
- themselves and 1. The number of primes is unbounded. By the
- prime number theorem, roughly 1 in 230 of the 100 decimal digit
- numbers is prime. Since there are about 10^70 particles in the
- universe, there are about 10^27 100 digit primes for each and
- every particle in the universe!
-
- *** probabilistic encryption -- a scheme by Goldwasser,
- Micali, and Blum that allows multiple ciphertexts for the
- same plaintext, i.e., any given plaintext may have many
- ciphertexts if the ciphering is repeated. This protects against
- certain types of known ciphertext attacks on RSA.
-
- *** proofs of identity -- proving who you are, either your
- true name, or your digital identity. Generally, possession of
- the right key is sufficient proof (guard your key!). Some work
- has been done on "is-a-person" credentialling agencies, using
- the so-called Fiat-Shamir protocol...think of this as a way to
- issue unforgeable digital passports. Physical proof of identity
- may be done with biometric security methods. Zero knowledge
- proofs of identity reveal nothing beyond the fact that the
- identity is as claimed. This has obvious uses for computer
- access, passwords, etc.
-
- *** protocol -- a formal procedure for solving some
- problem. Modern cryptology is mostly about the study of
- protocols for many problems, such as coin-flipping, bit
- commitment (blobs), zero knowledge proofs, dining
- cryptographers, and so on.
-
- *** public key -- the key distributed publicly to potential
- message-senders. It may be published in a phonebook-like
- directory or otherwise sent. A major concern is the validity
- of this public key to guard against spoofing or impersonation.
-
- *** public key cryptosystem -- the modern breakthrough
- in cryptology, designed by Diffie and Hellman, with
- contributions from several others. Uses trap door one-way
- functions so that encryption may be done by anyone with
- access to the "public key" but decryption may be done only by
- the holder of the "private key." Encompasses public key
- encryption, digital signatures, digital cash, and many other
- protocols and applications.
-
- *** public key encryption -- the use of modern
- cryptologic methods to provide message security and
- authentication. The RSA algorithm is the most widely used
- form of public key encryption, although other systems exist.
- A public key may be freely published, e.g., in phonebook-like
- directories, while the corresponding private key is closely
- guarded.
-
- *** public key patents -- M.I.T. and Stanford, due to the
- work of Rivest, Shamir, Adleman, Diffie, Hellman, and Merkle,
- formed Public Key Partners to license the various public key,
- digital signature, and RSA patents. These patents, granted in
- the early 1980s, expire between 1998 and 2002. PKP
- has licensed RSA Data Security Inc., of Redwood City, CA,
- which handles the sales, etc.
-
- *** quantum cryptography -- a system based on quantum-
- mechanical principles. Eavesdroppers alter the quantum state
- of the system and so are detected. Developed by Bennett and
- Brassard; so far only small laboratory demonstrations have
- been made.
-
- *** reputations -- the trail of positive and negative
- associations and judgments that some entity accrues. Credit
- ratings, academic credentials, and trustworthiness are all
- examples. A digital pseudonym will accrue these reputation
- credentials based on actions, opinions of others, etc. In
- crypto anarchy, reputations and agoric systems will be of
- paramount importance. There are many fascinating issues of
- how reputation-based systems work, how credentials can be
- bought and sold, and so forth.
-
- *** RSA -- the main public key encryption algorithm,
- developed by Ron Rivest, Adi Shamir, and Leonard Adleman. It
- exploits the difficulty of factoring large numbers to create a
- private key and public key. First published in 1978, it remains
- the core of modern public key systems. It is usually much
- slower than DES, but special-purpose modular exponentiation
- chips will likely speed it up. A popular scheme for speed is to
- use RSA to transmit session keys and then a high-speed
- cipher like DES for the actual message text.
- *** Description -- Let p and q be large primes, typically
- with more than 100 digits. Let n = pq and find some e such that
- e is relatively prime to (p - 1)(q - 1). The primes p and q
- (together with the decryption exponent d described below) form
- the private key for RSA. The set of numbers n and e forms the
- public key (recall that knowing n is not sufficient to easily
- find p and q...the factoring problem). A message M is encrypted
- by computing M^e mod n. The owner of the private key can decrypt
- the encrypted message by exploiting number theory results, as
- follows. An integer d is computed such that ed = 1
- (mod (p - 1)(q - 1)). A theorem of Euler and Fermat then gives
- M^(ed) = M (mod n), so computing (M^e)^d mod n recovers M. This
- means that in some sense the integers e and d are "inverses" of
- each other. [If this is unclear, please see one of the many
- texts and articles on public key encryption.]
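-
- The description can be exercised end to end with toy parameters.
- A minimal Python sketch (tiny, insecure primes, purely for
- illustration; pow(e, -1, phi) needs Python 3.8 or later):
-
-     p, q = 61, 53
-     n = p * q                  # 3233, the public modulus
-     phi = (p - 1) * (q - 1)    # 3120
-     e = 17                     # public exponent, relatively prime to phi
-     d = pow(e, -1, phi)        # 2753, satisfies e*d = 1 (mod phi)
-
-     M = 65                     # message, must be less than n
-     C = pow(M, e, n)           # encrypt: M^e mod n = 2790
-     assert pow(C, d, n) == M   # decrypt: C^d mod n recovers M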
-
- *** secret key cryptosystem -- A system which uses the
- same key to encrypt and decrypt traffic at each end of a
- communication link. Also called a symmetric or one-key
- system. Contrast with public key cryptosystem.
-
- *** smart cards -- a computer chip embedded in a credit
- card. They can hold cash, credentials, cryptographic keys,
- etc. Usually these are built with some degree of tamper-
- resistance. Smart cards may perform part of a crypto
- transaction, or all of it. Performing part of it may mean
- checking the computations of a more powerful computer, e.g.,
- one in an ATM.
-
- *** spoofing, or masquerading -- posing as another user.
- Used for stealing passwords, modifying files, and stealing
- cash. Digital signatures and other authentication methods are
- useful to prevent this. Public keys must be validated and
- protected to ensure that others don't substitute their own
- public keys, which users may then unwittingly use.
-
- *** steganography -- a part of cryptology dealing with
- hiding the very existence of messages and obscuring who is
- sending and receiving them. Message traffic is often padded
- to mask the telltale signal that a sudden burst of messages
- would otherwise provide.
-
- *** symmetric cipher -- same as secret key
- cryptosystem.
-
- *** tamper-responding modules, tamper-resistant
- modules (TRMs) -- sealed boxes or modules which are hard
- to open, requiring extensive probing and usually leaving ample
- evidence that the tampering has occurred. Various protective
- techniques are used, such as special metal or oxide layers on
- chips, armored coatings, embedded optical fibers, and other
- measures to thwart analysis. Popularly called "tamper-proof
- boxes." Uses include: smart cards, nuclear weapon initiators,
- cryptographic key holders, ATMs, etc.
-
- *** tampering, or active wiretapping -- interfering with
- messages and possibly modifying them. This may compromise
- data security, help to break ciphers, etc. See also spoofing.
-
- *** token -- some representation, such as ID cards, subway
- tokens, money, etc., that indicates possession of some
- property or value.
-
- *** traffic analysis -- determining who is sending or
- receiving messages by analyzing packets, frequency of
- packets, etc. Closely related to steganography. Usually
- countered with traffic padding.
-
- *** transmission rules -- the protocols for determining
- who can send messages in a DC protocol, and when. These
- rules are needed to prevent collision and deliberate jamming
- of the channels.
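-
- The collision problem is easy to see in a one-round simulation of
- the dining cryptographers protocol. A minimal Python sketch (names
- and structure are ours): each participant shares one coin with a
- neighbor, announces the XOR of the two coins he sees, and flips his
- announcement to send a 1; the XOR of all announcements is the
- channel output.
-
-     import secrets
-
-     def dc_net_round(n, senders):
-         # One shared coin between participant i and participant (i+1) % n.
-         coins = [secrets.randbits(1) for _ in range(n)]
-         out = 0
-         for i in range(n):
-             bit = coins[i] ^ coins[(i - 1) % n]  # the two coins i can see
-             if i in senders:
-                 bit ^= 1                         # flip to transmit a 1
-             out ^= bit
-         return out
-
-     print(dc_net_round(3, {1}))     # 1: a single sender gets through
-     print(dc_net_round(3, {0, 2}))  # 0: two senders collide and cancel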
-
- *** trap messages -- dummy messages in DC Nets which
- are used to catch jammers and disrupters. The messages
- contain no private information and are published in a blob
- beforehand so that the trap message can later be opened to
- reveal the disrupter. (There are many strategies to explore
- here.)
-
- *** trap-door -- In cryptography, a piece of secret
- information that allows the holder of a private key to invert a
- normally hard-to-invert function.
-
- *** trap-door one way functions -- functions which are
- easy to compute in the forward direction but infeasible to
- invert unless one holds a piece of secret trap-door
- information; disclosure of the algorithm for computing the
- function in the forward direction provides no help in
- computing it in the reverse direction. More simply put,
- trap-door one way functions are one way for all but the
- holder of the secret information. The RSA algorithm is the
- best-known example of such a function.
-
- *** unconditional security -- same as information-
- theoretic security, that is, unbreakable except by loss or
- theft of the key.
-
- *** unconditionally secure -- where no amount of
- intercepted ciphertext is enough to allow the cipher to be
- broken, as with the use of a one-time pad cipher. Contrast
- with computationally secure.
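-
- The canonical example, the one-time pad, fits in a few lines of
- Python; its unconditional security rests entirely on the pad being
- truly random, exactly as long as the message, and never reused:
-
-     import secrets
-
-     def otp(data, pad):
-         assert len(pad) == len(data)
-         return bytes(a ^ b for a, b in zip(data, pad))
-
-     msg = b"ATTACK AT DAWN"
-     pad = secrets.token_bytes(len(msg))
-     ct = otp(msg, pad)            # encrypt
-     assert otp(ct, pad) == msg    # the same XOR operation decrypts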
-
- *** voting, cryptographic -- Various schemes have been
- devised for anonymous, untraceable voting. Voting schemes
- should have several properties: privacy of the vote, security
- of the vote (no multiple votes), robustness against disruption
- by jammers or disrupters, verifiability (voter has confidence
- in the results), and efficiency.
-
- *** zero knowledge proofs -- proofs that convey no
- knowledge beyond the truth of the statement being proved.
- Peggy the Prover demonstrates to Sid the Skeptic that she is
- indeed in possession of some piece of knowledge without
- actually revealing any of that knowledge. This is useful for
- access to computers, because eavesdroppers or dishonest
- sysops cannot steal a secret that is never transmitted. Also
- called minimum disclosure proofs. Useful for proving
- possession of some property, or credential, such as age or
- voting status, without revealing personal information.
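-
- One round of the Fiat-Shamir identification protocol mentioned
- under "proofs of identity" can be sketched in Python (toy modulus
- and names, purely illustrative; a real system needs n to be a
- large product of two secret primes). Peggy proves she knows s,
- where v = s^2 mod n is public, revealing nothing about s; a
- cheater survives each round with probability only 1/2.
-
-     import secrets
-
-     n = 3233                 # toy modulus
-     s = 123                  # Peggy's secret
-     v = pow(s, 2, n)         # public value
-
-     def one_round():
-         r = secrets.randbelow(n - 1) + 1
-         x = pow(r, 2, n)              # Peggy commits to r^2
-         b = secrets.randbits(1)       # Sid's random challenge bit
-         y = (r * pow(s, b, n)) % n    # Peggy answers r or r*s
-         return pow(y, 2, n) == (x * pow(v, b, n)) % n
-
-     assert all(one_round() for _ in range(20))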
-
- --
- ..........................................................................
- Timothy C. May | Crypto Anarchy: encryption, digital money,
- tcmay@netcom.com | anonymous networks, digital pseudonyms, zero
- 408-688-5409 | knowledge, reputations, information markets,
- W.A.S.T.E.: Aptos, CA | black markets, collapse of governments.
- Higher Power: 2^756839 | PGP Public Key: by arrangement.
-
-
-
- From owner-cypherpunks@toad.com Wed Aug 11 15:53:21 1993
- id <AA11099>; Wed, 11 Aug 1993 15:53:19 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA19178; Wed, 11 Aug 93 17:45:05 -0400
- Received: by toad.com id AA14598; Wed, 11 Aug 93 14:32:06 PDT
- Received: by toad.com id AA14592; Wed, 11 Aug 93 14:31:45 PDT
- Return-Path: <washofc!dsobel@uu5.psi.com>
- Received: from uu5.psi.com ([38.145.226.3]) by toad.com id AA14588; Wed, 11 Aug
- Received: from washofc.UUCP by uu5.psi.com (5.65b/4.0.071791-PSI/PSINet) via UU
- id AA09856 for ; Wed, 11 Aug 93 14:13:17 -0400
- Received: from OFFICE (QM 2.5.1) by washofc.cpsr.org (UMCP\QM 2.0)
- id AA04658; Wed, 11 Aug 1993 14:17:12 EST
- Message-Id: <00541.2827923432.4658@washofc.cpsr.org>
- Organization: CPSR Washington Office
- X-Umcp-To: Cypherpunks
- From: David Sobel <dsobel@washofc.cpsr.org>
- To: Cypherpunks <cypherpunks@toad.com>
- Date: Wed, 11 Aug 1993 13:59:53 EST
- Subject: Clipper trapdoor?
- Status: OR
-
- Clipper trapdoor?
-
- Peter Wayner <pcw@access.digex.net> writes:
-
- >My general impression is that the system is secure. Many people
- >have played paranoid and expressed concerns that the classified
- >algorithm might be hiding a trapdoor. It became clear to me that
- >these concerns were really silly. There is a built-in trapdoor
- >to be used by the government when it is "legally authorized" to
- >intercept messages. The NSA has rarely had trouble in the past
- >exercising either its explicitly granted legal authority or
- >its implied authority. The phrase "national security" is a
- >powerful pass phrase around Washington and there is no reason
- >for me to believe that the NSA wouldn't get all of the access
- >to the escrow database that it needs to do its job. Building in
- >a backdoor would only leave a weakness for an opponent to exploit
- >and that is something that is almost as sacrilegious at the NSA
- >as just putting the classified secrets in a Fed Ex package to
- >Saddam Hussein.
-
- This raises an interesting question and I draw a totally different
- conclusion. If, as we have been told, the only way for an agency to
- obtain the escrow keys is to present a court order, then NSA needs
- to obtain such an order to decrypt *any* communication it intercepts.
-
- I don't really understand what Peter means when he says that "NSA has
- rarely had trouble in the past exercising either its explicitly granted
- legal authority or its implied authority. The phrase 'national security'
- is a powerful pass phrase around Washington and there is no reason
- for me to believe that the NSA wouldn't get all of the access
- to the escrow database that it needs to do its job."
-
- Does this mean NSA would, in fact, obtain a warrant in order to "get all
- of the access to the escrow database that it needs to do its job"? If so,
- this would represent an unprecedented change in the way NSA does "its
- job." NSA has no domestic law enforcement authority, so it would
- obviously never be in a position to obtain a law enforcement wiretap
- warrant under Title III. The only possible way for NSA to obtain a warrant
- would be under the Foreign Intelligence Surveillance Act (FISA). But the
- Foreign Intelligence Surveillance Court, which issues warrants under FISA,
- has ruled that FISA's provisions
-
- limit the authority to conduct electronic surveillances to the U.S.
- in a geographic sense as defined in sec. 101(i). The drafters left
- to another day the matter of "broadening this legislation to apply
- overseas ... because the problems and circumstances of overseas
- surveillance demand separate treatment."
-
- In the Matter of the Application of the United States for an Order
- Authorizing
- the Physical Search of Nonresidential Premises and Personal Property (1981),
- footnote 1 (citations omitted).
-
- Consider the following hypothetical: Iraqi agents smuggle Clipper phones
- out of the U.S. Saddam Hussein uses them to communicate with his military
- commander in Basra. NSA intercepts the communications. Question: How
- does NSA decrypt the messages?
-
- Note that neither Title III (law enforcement) nor FISA (U.S.-based) apply
- to this situation, so we have to assume that NSA will not have a court order
- to obtain the escrow keys. I have to conclude that NSA would not be putting
- this technology out into the world *unless* it did, in fact, have some
- way to decrypt messages *without* access to the escrow keys.
-
- Am I missing something?
-
-
- David Sobel
- CPSR Legal Counsel
-
-
-
-
- Article 18507 of sci.crypt:
- Path: lynx.unm.edu!fmsrl7!destroyer!sol.ctr.columbia.edu!math.ohio-state.edu!da
- From: denning@guvax.acc.georgetown.edu
- Newsgroups: sci.crypt
- Subject: SKIPJACK Review, Interim Report
- Message-ID: <1993Aug1.220927.4510@guvax.acc.georgetown.edu>
- Date: 1 Aug 93 22:09:27 -0400
- Distribution: world
- Organization: Georgetown University
- Lines: 369
-
- SKIPJACK Review
-
- Interim Report
-
- The SKIPJACK Algorithm
-
-
- Ernest F. Brickell, Sandia National Laboratories
- Dorothy E. Denning, Georgetown University
- Stephen T. Kent, BBN Communications Corporation
- David P. Maher, AT&T
- Walter Tuchman, Amperif Corporation
-
- July 28, 1993
-
- (copyright 1993)
-
-
- Executive Summary
-
- The objective of the SKIPJACK review was to provide a mechanism whereby
- persons outside the government could evaluate the strength of the
- classified encryption algorithm used in the escrowed encryption devices
- and publicly report their findings. Because SKIPJACK is but one
- component of a large, complex system, and because the security of
- communications encrypted with SKIPJACK depends on the security of the
- system as a whole, the review was extended to encompass other
- components of the system. The purpose of this Interim Report is to
- report on our evaluation of the SKIPJACK algorithm. A later Final
- Report will address the broader system issues.
-
- The results of our evaluation of the SKIPJACK algorithm are as
- follows:
-
- 1. Under an assumption that the cost of processing power is halved
- every eighteen months, it will be 36 years before the cost of
- breaking SKIPJACK by exhaustive search will be equal to the cost
- of breaking DES today. Thus, there is no significant risk that
- SKIPJACK will be broken by exhaustive search in the next 30-40
- years.
-
- 2. There is no significant risk that SKIPJACK can be broken through a
- shortcut method of attack.
-
- 3. While the internal structure of SKIPJACK must be classified in
- order to protect law enforcement and national security objectives,
- the strength of SKIPJACK against a cryptanalytic attack does not
- depend on the secrecy of the algorithm.
-
-
-
- 1. Background
-
- On April 16, the President announced a new technology initiative aimed
- at providing a high level of security for sensitive, unclassified
- communications, while enabling lawfully authorized intercepts of
- telecommunications by law enforcement officials for criminal
- investigations. The initiative includes several components:
-
- A classified encryption/decryption algorithm called "SKIPJACK."
-
- Tamper-resistant cryptographic devices (e.g., electronic chips),
- each of which contains SKIPJACK, classified control software, a
- device identification number, a family key used by law enforcement,
- and a device unique key that unlocks the session key used to
- encrypt a particular communication.
-
- A secure facility for generating device unique keys and programming
- the devices with the classified algorithms, identifiers, and keys.
-
- Two escrow agents that each hold a component of every device unique
- key. When combined, those two components form the device unique
- key.
-
- A law enforcement access field (LEAF), which enables an authorized
- law enforcement official to recover the session key. The LEAF is
- created by a device at the start of an encrypted communication and
- contains the session key encrypted under the device unique key
- together with the device identifier, all encrypted under the family
- key.
-
- LEAF decoders that allow an authorized law enforcement official to
- extract the device identifier and encrypted session key from an
- intercepted LEAF. The identifier is then sent to the escrow
- agents, who return the components of the corresponding device
- unique key. Once obtained, the components are used to reconstruct
- the device unique key, which is then used to decrypt the session
- key.
-
- This report reviews the security provided by the first component,
- namely the SKIPJACK algorithm. The review was performed pursuant to
- the President's direction that "respected experts from outside the
- government will be offered access to the confidential details of the
- algorithm to assess its capabilities and publicly report their
- findings."  The Acting Director of the National Institute of Standards
- and Technology (NIST) sent letters of invitation to potential
- reviewers. The authors of this report accepted that invitation.
-
- We attended an initial meeting at the Institute for Defense Analyses
- Supercomputing Research Center (SRC) from June 21-23. At that meeting,
- the designer of SKIPJACK provided a complete, detailed description of
- the algorithm, the rationale for each feature, and the history of the
- design. The head of the NSA evaluation team described the evaluation
- process and its results. Other NSA staff briefed us on the LEAF
- structure and protocols for use, generation of device keys, protection
- of the devices against reverse engineering, and NSA's history in the
- design and evaluation of encryption methods contained in SKIPJACK.
- Additional NSA and NIST staff were present at the meeting to answer our
- questions and provide assistance. All staff members were forthcoming
- in providing us with requested information.
-
- At the June meeting, we agreed to integrate our individual evaluations
- into this joint report. We also agreed to reconvene at SRC from July
- 19-21 for further discussions and to complete a draft of the report.
- In the interim, we undertook independent tasks according to our
- individual interests and availability. Ernest Brickell specified a
- suite of tests for evaluating SKIPJACK. Dorothy Denning worked at NSA
- on the refinement and execution of these and other tests that took into
- account suggestions solicited from Professor Martin Hellman at Stanford
- University. NSA staff assisted with the programming and execution of
- these tests. Denning also analyzed the structure of SKIPJACK and its
- susceptibility to differential cryptanalysis. Stephen Kent visited NSA
- to explore in more detail how SKIPJACK compared with NSA encryption
- algorithms that he already knew and that were used to protect
- classified data. David Maher developed a risk assessment approach
- while continuing his ongoing work on the use of the encryption chip in
- the AT&T Telephone Security Device. Walter Tuchman investigated the
- anti-reverse engineering properties of the chips.
-
- We investigated more than just SKIPJACK because the security of
- communications encrypted with the escrowed encryption technology
- depends on the security provided by all the components of the
- initiative, including protection of the keys stored on the devices,
- protection of the key components stored with the escrow agents, the
- security provided by the LEAF and LEAF decoder, protection of keys
- after they have been transmitted to law enforcement under court order,
- and the resistance of the devices to reverse engineering. In addition,
- the success of the technology initiative depends on factors besides
- security, for example, performance of the chips. Because some
- components of the escrowed encryption system, particularly the key
- escrow system, are still under design, we decided to issue this Interim
- Report on the security of the SKIPJACK algorithm and to defer our Final
- Report until we could complete our evaluation of the system as a
- whole.
-
-
- 2. Overview of the SKIPJACK Algorithm
-
- SKIPJACK is a 64-bit "electronic codebook" algorithm that transforms a
- 64-bit input block into a 64-bit output block. The transformation is
- parameterized by an 80-bit key, and involves performing 32 steps or
- iterations of a complex, nonlinear function. The algorithm can be used
- in any one of the four operating modes defined in FIPS 81 for use with
- the Data Encryption Standard (DES).
-
- The SKIPJACK algorithm was developed by NSA and is classified SECRET.
- It is representative of a family of encryption algorithms developed in
- 1980 as part of the NSA suite of "Type I" algorithms, suitable for
- protecting all levels of classified data. The specific algorithm,
- SKIPJACK, is intended to be used with sensitive but unclassified
- information.
-
- The strength of any encryption algorithm depends on its ability to
- withstand an attack aimed at determining either the key or the
- unencrypted ("plaintext") communications. There are basically two
- types of attack, brute-force and shortcut.
-
-
- 3. Susceptibility to Brute Force Attack by Exhaustive Search
-
- In a brute-force attack (also called "exhaustive search"), the
- adversary essentially tries all possible keys until one is found that
- decrypts the intercepted communications into a known or meaningful
- plaintext message. The resources required to perform an exhaustive
- search depend on the length of the keys, since the number of possible
- keys is directly related to key length. In particular, a key of length
- N bits has 2^N possibilities. SKIPJACK uses 80-bit keys, which means
- there are 2^80 (approximately 10^24) or more than 1 trillion trillion
- possible keys.
-
- An implementation of SKIPJACK optimized for a single processor on the
- 8-processor Cray YMP performs about 89,000 encryptions per second. At
- that rate, it would take more than 400 billion years to try all keys.
- Assuming the use of all 8 processors and aggressive vectorization, the
- time would be reduced to about a billion years.
-
- A more speculative attack using a future, hypothetical, massively
- parallel machine with 100,000 RISC processors, each of which was
- capable of 100,000 encryptions per second, would still take about 4
- million years. The cost of such a machine might be on the order of $50
- million. In an even more speculative attack, a special purpose machine
- might be built using 1.2 billion $1 chips with a 1 GHz clock. If the
- algorithm could be pipelined so that one encryption step were performed
- per clock cycle, then the $1.2 billion machine could exhaust the key
- space in 1 year.
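-
- The arithmetic behind these estimates is easy to reproduce in
- Python. (Reading the pipelined design as one encryption per 32
- clock cycles per chip is our assumption; it reproduces the
- report's one-year figure.)
-
-     keys = 2**80                      # about 1.2e24 possible keys
-     year = 365.25 * 24 * 3600         # seconds per year
-
-     # 89,000 encryptions/second is the report's single-processor figure.
-     print(keys / 89_000 / year)                # ~4.3e11: "400 billion years"
-     print(keys / (100_000 * 100_000) / year)   # ~3.8e6: "about 4 million years"
-     print(keys / (1.2e9 * 1e9 / 32) / year)    # ~1.0: the $1.2 billion machine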
-
- Another way of looking at the problem is by comparing a brute force
- attack on SKIPJACK with one on DES, which uses 56-bit keys. Given that
- no one has demonstrated a capability for breaking DES, DES offers a
- reasonable benchmark. Since SKIPJACK keys are 24 bits longer than DES
- keys, there are 2^24 times more possibilities. Assuming that the cost
- of processing power is halved every eighteen months, then it will not
- be for another 24 * 1.5 = 36 years before the cost of breaking
- SKIPJACK is equal to the cost of breaking DES today. Given the lack of
- demonstrated capability for breaking DES, and the expectation that the
- situation will continue for at least several more years, one can
- reasonably expect that SKIPJACK will not be broken within the next
- 30-40 years.
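-
- The 36-year figure is simply the key-length difference scaled by
- the assumed doubling period:
-
-     extra_bits = 80 - 56       # SKIPJACK key bits minus DES key bits
-     print(extra_bits * 1.5)    # 36.0 years at one doubling per 18 months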
-
- Conclusion 1: Under an assumption that the cost of processing power
- is halved every eighteen months, it will be 36 years before the cost of
- breaking SKIPJACK by exhaustive search will be equal to the cost of
- breaking DES today. Thus, there is no significant risk that SKIPJACK
- will be broken by exhaustive search in the next 30-40 years.
-
- 4. Susceptibility to Shortcut Attacks
-
- In a shortcut attack, the adversary exploits some property of the
- encryption algorithm that enables the key or plaintext to be determined
- in much less time than by exhaustive search. For example, the RSA
- public-key encryption method is attacked by factoring a public value
- that is the product of two secret primes into its primes.
-
- Most shortcut attacks use probabilistic or statistical methods that
- exploit a structural weakness, unintentional or intentional (i.e., a
- "trapdoor"), in the encryption algorithm. In order to determine
- whether such attacks are possible, it is necessary to thoroughly
- examine the structure of the algorithm and its statistical properties.
- In the time available for this review, it was not feasible to conduct
- an evaluation on the scale that NSA has conducted or that has been
- conducted on the DES.  Such a review would require many man-years of
- effort over a considerable time interval. Instead, we concentrated on
- reviewing NSA's design and evaluation process. In addition, we
- conducted several of our own tests.
-
- 4.1 NSA's Design and Evaluation Process
-
- SKIPJACK was designed using building blocks and techniques that date
- back more than forty years. Many of the techniques are related to work
- that was evaluated by some of the world's most accomplished and famous
- experts in combinatorics and abstract algebra. SKIPJACK's more
- immediate heritage dates to around 1980, and its initial design to
- 1987.
-
- SKIPJACK was designed to be evaluatable, and the design and evaluation
- approach was the same used with algorithms that protect the country's
- most sensitive classified information. The specific structures
- included in SKIPJACK have a long evaluation history, and the
- cryptographic properties of those structures had many prior years of
- intense study before the formal process began in 1987. Thus, an
- arsenal of tools and data was available. This arsenal was used by
- dozens of adversarial evaluators whose job was to break SKIPJACK. Many
- spent at least a full year working on the algorithm. Besides highly
- experienced evaluators, SKIPJACK was subjected to cryptanalysis by less
- experienced evaluators who were untainted by past approaches. All
- known methods of attacks were explored, including differential
- cryptanalysis. The goal was a design that did not allow a shortcut
- attack.
-
- The design underwent a sequence of iterations based on feedback from
- the evaluation process. These iterations eliminated properties which,
- even though they might not allow successful attack, were related to
- properties that could be indicative of vulnerabilities. The head of
- the NSA evaluation team confidently concluded "I believe that SKIPJACK
- can only be broken by brute force -- there is no better way."
-
- In summary, SKIPJACK is based on some of NSA's best technology.
- Considerable care went into its design and evaluation in accordance
- with the care given to algorithms that protect classified data.
-
- 4.2 Independent Analysis and Testing
-
- Our own analysis and testing increased our confidence in the strength
- of SKIPJACK and its resistance to attack.
-
- 4.2.1 Randomness and Correlation Tests
-
- A strong encryption algorithm will behave like a random function of the
- key and plaintext so that it is impossible to determine any of the key
- bits or plaintext bits from the ciphertext bits (except by exhaustive
- search). We ran two sets of tests aimed at determining whether
- SKIPJACK is a good pseudo random number generator. These tests were
- run on a Cray YMP at NSA. The results showed that SKIPJACK behaves
- like a random function and that ciphertext bits are not correlated with
- either key bits or plaintext bits. Appendix A gives more details.
-
- 4.2.2 Differential Cryptanalysis
-
- Differential cryptanalysis is a powerful method of attack that exploits
- structural properties in an encryption algorithm. The method involves
- analyzing the structure of the algorithm in order to determine the
- effect of particular differences in plaintext pairs on the differences
- of their corresponding ciphertext pairs, where the differences are
- represented by the exclusive-or of the pair. If it is possible to
- exploit these differential effects in order to determine a key in less
- time than with exhaustive search, an encryption algorithm is said to be
- susceptible to differential cryptanalysis. However, an actual attack
- using differential cryptanalysis may require substantially more chosen
- plaintext than can be practically acquired.
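-
- The basic bookkeeping of the method, the difference distribution
- table, is easy to compute for a toy 4-bit S-box (row 0 of DES's S1,
- chosen here only as a familiar example). Small, flat counts mean
- input differences propagate unpredictably:
-
-     SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
-             0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
-
-     ddt = [[0] * 16 for _ in range(16)]
-     for x in range(16):
-         for dx in range(16):
-             dy = SBOX[x] ^ SBOX[x ^ dx]          # output difference
-             ddt[dx][dy] += 1                     # count the pair (dx, dy)
-
-     print(max(max(row) for row in ddt[1:]))      # worst case over dx != 0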
-
- We examined the internal structure of SKIPJACK to determine its
- susceptibility to differential cryptanalysis. We concluded it was not
- possible to perform an attack based on differential cryptanalysis in
- less time than with exhaustive search.
-
- 4.2.3 Weak Key Test
-
- Some algorithms have "weak keys" that might permit a shortcut
- solution. DES has a few weak keys, which follow from a pattern of
- symmetry in the algorithm. We saw no pattern of symmetry in the
- SKIPJACK algorithm which could lead to weak keys. We also
- experimentally tested the all "0" key (all 80 bits are "0") and the all
- "1" key to see if they were weak and found they were not.
-
- 4.2.4 Symmetry Under Complementation Test
-
- The DES satisfies the property that for a given plaintext-ciphertext
- pair and associated key, encryption of the one's complement of the
- plaintext with the one's complement of the key yields the one's
- complement of the ciphertext. This "complementation property" shortens
- an attack by exhaustive search by a factor of two since half the keys
- can be tested by computing complements in lieu of performing a more
- costly encryption. We tested SKIPJACK for this property and found that
- it did not hold.
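-
- The complementation property tested for is easy to verify on DES
- itself. A sketch assuming the third-party pycryptodome package
- provides the DES cipher:
-
-     import os
-     from Crypto.Cipher import DES
-
-     def comp(b):
-         return bytes(x ^ 0xFF for x in b)        # one's complement
-
-     key, pt = os.urandom(8), os.urandom(8)
-     ct  = DES.new(key, DES.MODE_ECB).encrypt(pt)
-     ct2 = DES.new(comp(key), DES.MODE_ECB).encrypt(comp(pt))
-     assert ct2 == comp(ct)   # holds for DES; the report found SKIPJACK differs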
-
- 4.2.5 Comparison with Classified Algorithms
-
- We compared the structure of SKIPJACK to that of NSA Type I algorithms
- used in current and near-future devices designed to protect classified
- data. This analysis was conducted with the close assistance of the
- cryptographer who developed SKIPJACK and included an in-depth
- discussion of design rationale for all of the algorithms involved.
- Based on this comparative, structural analysis of SKIPJACK against
- these other algorithms, and a detailed discussion of the similarities
- and differences between these algorithms, our confidence in the basic
- soundness of SKIPJACK was further increased.
-
- Conclusion 2: There is no significant risk that SKIPJACK can be broken
- through a shortcut method of attack.
-
-
- 5. Secrecy of the Algorithm
-
- The SKIPJACK algorithm is sensitive for several reasons. Disclosure of
- the algorithm would permit the construction of devices that fail to
- properly implement the LEAF, while still interoperating with legitimate
- SKIPJACK devices. Such devices would provide high quality
- cryptographic security without preserving the law enforcement access
- capability that distinguishes this cryptographic initiative.
- Additionally, the SKIPJACK algorithm is classified SECRET NOT
- RELEASABLE TO FOREIGN NATIONALS. This classification reflects the high
- quality of the algorithm, i.e., it incorporates design techniques that
- are representative of algorithms used to protect classified
- information. Disclosure of the algorithm would permit analysis that
- could result in discovery of these classified design techniques, and
- this would be detrimental to national security.
-
- However, while full exposure of the internal details of SKIPJACK would
- jeopardize law enforcement and national security objectives, it would
- not jeopardize the security of encrypted communications. This is
- because a shortcut attack is not feasible even with full knowledge of
- the algorithm. Indeed, our analysis of the susceptibility of SKIPJACK
- to a brute force or shortcut attack was based on the assumption that
- the algorithm was known.
-
- Conclusion 3: While the internal structure of SKIPJACK must be
- classified in order to protect law enforcement and national security
- objectives, the strength of SKIPJACK against a cryptanalytic attack
- does not depend on the secrecy of the algorithm.
-
-
-
- Article 18517 of sci.crypt:
- Path: lynx.unm.edu!fmsrl7!destroyer!gatech!concert!rutgers!att-out!cbnewsh!cbne
- From: wcs@anchor.ho.att.com (Bill Stewart +1-908-949-0705)
- Newsgroups: sci.crypt
- Subject: Re: SKIPJACK Review, Interim Report
- Message-ID: <WCS.93Aug2043053@rainier.ATT.COM>
- Date: 2 Aug 93 09:30:53 GMT
- References: <1993Aug1.220927.4510@guvax.acc.georgetown.edu>
- <23i6hm$2ov@charm.magnus.acs.ohio-state.edu>
- Sender: news@cbnewsh.cb.att.com (NetNews Administrator)
- Organization: Electronic Birdwatching Society
- Lines: 57
- In-Reply-To: jebright@magnus.acs.ohio-state.edu's message of 2 Aug 1993 04:52:0
- Nntp-Posting-Host: rainier.ho.att.com
-
- Thanks to Dr. Denning for posting the interim report. On a brief read-through,
- I haven't got much substantive to say yet, so I'll depart from my
- normal procedures and not say much :-) Three concerns, though:
- - While it's good that the whole system is being evaluated and not
- just Skipjack, I'm concerned about the propaganda value of
- releasing a report that says "the committee says it's just fine",
- which will presumably be abused in attempts to railroad
- standards committees into specifying Clipper.
- - While the analysis looked at the design techniques for SkipJack
- and other classified algorithms, it did *not* address the issue of
- why, if SkipJack is so strong, it's only approved for unclassified
- material, while the other algorithms can be used for classified.
- Is it weaker in some way? Key length alone doesn't count...
- - Key length adequacy - DES can be brute-forced now, at substantial expense;
- two different designs have been published for ~$30M, 1-day search.
- The report makes a big deal about how 24 bits longer key means
- about 36 years longer life if computer power keeps doubling every
- 1.5 years, though it's speculative how long that will continue,
- and makes a big deal about how 80-bit keys are unsearchable now.
- An 80-bit key is still vulnerable in most of our lifetimes;
- a 128-bit key really is not.
-
- In article <23i6hm$2ov@charm.magnus.acs.ohio-state.edu> jebright@magnus.acs.ohi
- >Additionally, the SKIPJACK algorithm is classified SECRET NOT
- >RELEASABLE TO FOREIGN NATIONALS. This classification reflects the high
- >quality of the algorithm, i.e., it incorporates design techniques that
- >are representative of algorithms used to protect classified
- >information. Disclosure of the algorithm would permit analysis that
- >could result in discovery of these classified design techniques, and
- >this would be detrimental to national security.
-
- Wasn't it a Dr. Denning that said secrecy of algorithms should NOT be
- used to build secure crypto?  History certainly indicates this.
- Is all of our crypto based on a house of cards?
-
- She's making another point here - this is using secrecy to prevent
- *other* people from building secure crypto, not just to obscure
- current crypto algorithms. However, calling that "detrimental to
- national security" would be blatant political propaganda.....
-
- Realistically, though, there may have been design mistakes in current
- crypto algorithms used for classified information; the NSA isn't
- infallible, and they may have missed something. And disclosing the
- algorithm would increase the possibility of any mistakes being discovered.
-
- >However, while full exposure of the internal details of SKIPJACK would
- >jeopardize law enforcement and national security objectives, it would
-
- I notice that this refers to "national security objectives" rather
- than "national security", and doesn't say *who* set these objectives.
-
- More later...
- --
- # Pray for peace; Bill
- # Bill Stewart 1-908-949-0705 wcs@anchor.att.com AT&T Bell Labs 4M312 Holmdel N
- # White House Comment Line 1-202-456-1111 fax 1-202-456-2461
- # ROT-13 public key available upon request
-
-
- Article 18525 of sci.crypt:
- Newsgroups: sci.crypt
- Path: lynx.unm.edu!fmsrl7!ukma!usenet.ins.cwru.edu!agate!library.ucla.edu!ddsw1
- From: george@tessi.com (George Mitchell)
- Subject: Re: SKIPJACK Review, Interim Report
- Message-ID: <1993Aug2.170306.25724@tessi.com>
- Organization: Test Systems Strategies, Inc., Beaverton, Oregon
- References: <1993Aug1.220927.4510@guvax.acc.georgetown.edu>
- Date: Mon, 2 Aug 1993 17:03:06 GMT
- Lines: 35
-
- Thank you, Dr. Denning, for posting the Skipjack report in a timely
- manner. On the whole, it looks like good news. In particular, I was
- happy to see these paragraphs:
-
- >5. Secrecy of the Algorithm
-
- >The SKIPJACK algorithm is sensitive for several reasons. Disclosure of
- >the algorithm would permit the construction of devices that fail to
- >properly implement the LEAF, while still interoperating with legitimate
- >SKIPJACK devices. Such devices would provide high quality
- >cryptographic security without preserving the law enforcement access
- >capability that distinguishes this cryptographic initiative.
- [. . .]
- >However, while full exposure of the internal details of SKIPJACK would
- >jeopardize law enforcement and national security objectives, it would
- >not jeopardize the security of encrypted communications. This is
- >because a shortcut attack is not feasible even with full knowledge of
- >the algorithm. Indeed, our analysis of the susceptibility of SKIPJACK
- >to a brute force or shortcut attack was based on the assumption that
- >the algorithm was known.
-
- >Conclusion 3: While the internal structure of SKIPJACK must be
- >classified in order to protect law enforcement and national security
- >objectives, the strength of SKIPJACK against a cryptanalytic attack
- >does not depend on the secrecy of the algorithm.
-
- Why did this make me so happy, when I am generally so displeased
- with the Clipper initiative and the key-escrow scheme?
-
- Because the disclosure of the Skipjack algorithm is INEVITABLE.
-
- You can bet that {the top bad guys of the moment} will pry loose
- the algorithm within a year. While they may not be public-spirited
- enough to immediately release it to the rest of us, eventually
- someone will. -- George Mitchell (george@tessi.com)
-
-
-
-
-
- Joining the EFF
- by Esther Dyson
- Publisher of Release 1.0
-
- The Electronic Frontier Foundation is probably best--but incorrectly--
- known as "Mitch Kapor's organization to defend computer hackers." In
- fact, the basic message of the Foundation is: "There's a new world coming.
- Let's make sure it has rules we can live with." These rules will
- establish the rights and also the responsibilities of the users of the
- electronic infrastructure--which means, eventually, all of us.
-
- The Foundation's most visible efforts, yes, involve the defense of
- people charged with various forms of electronic trespass and damage.
- This is not to say that there's no such thing as illegal hacking, but
- that not all hacking is illegal. Many hackers' rights are abridged when
- they are arrested by government agents who don't understand how a
- computer works. There's a certain fear of the unknown that makes people
- suspect the worst of a supposed "computer criminal." Searches have been
- overly broad, and charges ridiculously overstated. Moreover, innocent
- bystanders are hurt too, when bulletin boards are closed down and their
- means of communication with each other is disrupted.
-
- Sentences are also unduly harsh: Consider the proposed prohibition on
- Robert Riggs' use of a computer after his release from prison. The
- computer is not a magic, deadly instrument but rather something closer
- to a telephone. Many criminals plan their crimes by telephone or even
- commit telephone fraud, but they don't get barred from telephone usage
- thereafter. Says EFF: "Such restrictions tend to promote the notion that
- computers are inherently dangerous...[and that] access [to them] falls
- properly within the scope of government action."
-
- The EFF also advocates government funding for the National Research and
- Education Network, and passage of bills to do so currently in the Senate
- and House. That doesn't mean that NREN would be the only thing going,
- but it would be a spur to and resource for private efforts. Certainly
- such a network should exist, but what's the best way to get it done?
- Should access be subsidized for the poor or distant, as it was for
- telephone service and still is for postal service? Should the subsidies
- be direct, or should they go to users, or should they be achieved
- through regulation?
-
- Perhaps these questions don't have absolute answers, just as the
- telephone business has evolved through a variety of forms (not always
- gracefully, to be sure). Perhaps we should start with a subsidized
- network that ultimately will pay its way! Although the EFF has
- positions on these issues, its major concern is that the public take
- part in addressing them, rather than leaving decisions up to a handful
- of bureaucrats and interested parties.
-
- Beyond that, there are important issues to consider and resolve, such as
- the definition and protection of Constitutional rights including
- privacy, free speech and assembly. In some cases, it's more important to
- have laws that are clear than precisely what those laws are. The world
- can adjust to most laws, as long as they make some sense and are
- consistent. Most interesting right now is the delicate tension over the
- classification of network services such as Compuserve and Prodigy. Are
- they publishers, liable for the information they disseminate, or
- utilities and common carriers, required to carry anything for the public
- at large--and therefore not liable for its content? Or is this a false
- dichotomy (as AMIX's Phil Salin asserts): For example, a BBS might be
- like a bookstore: free to select the books it stocks and sells, but not
- responsible for their content individually (i.e., for libel, say). Nor
- is the bookstore responsible for what anyone says inside its walls. Yet
- some "adult" bookstores and record stores have been closed by local
- legal actions. The precedents are muddy.
-
- Finally, there's the awkward question of how to make the network good
- for people without stuffing culture down unwilling throats. If you
- believe that broadcast TV is mostly junk and public TV is mostly
- subsidized culture for the well-off, how do we make networks a people's
- medium - real global villages rather than a global TV set or a global
- museum? Will people use them to communicate rather than vegetate if you
- make it easy? Can we regain the community involvement people lost when
- everything became too big and complicated? Are citizens' groups working
- over the net fringe groups, or are they harbingers of how everyone could
- get involved?
-
- I came to this with the benign American assumption that anyone
- apprehended by the police has probably done something wrong; spending
- time in Eastern Europe, watching the LA police videos and learning about
- some of the EFF cases have changed my perspective forever.
-
- I am now a board member of the EFF. But don't worry, Release 1.0 won't
- become a mouthpiece for the EFF. In fact, when Mitch Kapor asked me to
- join, I responded that I was pleased and flattered, but not sure I
- should join; I certainly don't agree with all the views of the other
- board members. "That," said Mitch, "is the point."
-
- In other words, I joined the EFF to help set its agenda, not just to
- help carry it out. And so I strongly urge that you get involved too.
-
- Esther Dyson is the Editor and publisher of Release 1.0, a newsletter
- covering the computer industry, from which this article is reproduced by
- permission.
-
-
-
-
- From owner-cypherpunks@toad.com Sun Aug 1 16:02:48 1993
- id <AA01400>; Sun, 1 Aug 1993 16:02:46 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA01251; Sun, 1 Aug 93 17:59:44 -0400
- Received: by toad.com id AA21591; Sun, 1 Aug 93 14:53:32 PDT
- Received: by toad.com id AA21579; Sun, 1 Aug 93 14:53:07 PDT
- Return-Path: <julian@panix.com>
- Received: from panix.com ([198.7.0.2]) by toad.com id AA21575; Sun, 1 Aug 93 14
- Received: by panix.com id AA20354
- (5.65c/IDA-1.4.4 for cypherpunks@toad.com); Sun, 1 Aug 1993 17:52:57 -0400
- From: Julian Dibbell <julian@panix.com>
- Message-Id: <199308012152.AA20354@panix.com>
- Subject: Village Voice sidebars
- To: cypherpunks@toad.com
- Date: Sun, 1 Aug 1993 17:52:57 -0400 (EDT)
- Mime-Version: 1.0
- Content-Type: text/plain; charset=US-ASCII
- Content-Transfer-Encoding: 7bit
- Content-Length: 7407
- Status: OR
-
- Here are the two short sidebars that accompany the Village Voice article on
- Cypherpunks et al. Posted by and with the permission of the author.
-
- The first contains some of the more practical information that Tim May was
- wondering about, though it does not point anyone towards ftp sites, mailing
- lists, or anything as concrete as that. I didn't know whether you all would
- appreciate an influx of "left-biased" :-) crypto-naifs flooding in here as a
- result of my posting the list address, so I refrained. Also didn't think
- advertising locations for PGP was a good idea, given the legal hassles that
- might result to people doing the distribution. But if any of you think I was
- being overscrupulous, I encourage you to write the Voice with further
- information and I will do my best to see the letter gets published.
-
- BUILDING A BETTER MONKEY WRENCH
-
- Contrary to the conventional wisdom of an age gone cuckoo for ``smart''
- technology, Luddism is neither dead nor beside the point -- it's just gotten
- smarter. The Cypherpunks and other cryptography hackers are model
- practitioners of a new, techno-savvy Luddism, implementing and popularizing
- sophisticated gadgets that could short-circuit the awesome surveillance
- capabilities built into cyberspace without harming its equally awesome power
- to connect individuals. Long-term, these brave new tools will do more to
- keep Big Brother out of your business than any legislation can, so you owe
- yourself at least a cursory understanding of how they work. The following
- primer should jump-start you. Read it and get smart.
-
- PUBLIC-KEY CRYPTOGRAPHY: Most encryption schemes require sender and receiver
- to agree on a secret encoding number, or key, before communication. This
- increases vulnerability, since that first message establishing the key can't
- itself be encrypted. Public-key systems, invented in 1975 by Ur-cypherpunk
- Whitfield Diffie along with Martin Hellman, have no such requirement, making
- them ideal for the highly snoopable channels of computer networks. In
- public-key crypto, everybody creates two keys, one published for all the
- world to read, and one kept absolutely secret. Whatever's encrypted with the
- first can only be unlocked with the second. Thus, if you want to send
- someone a secret message there's no need to make prior contact -- you just
- look up that person's public key and use it to encrypt the text. Current
- usage: The free public-key encryption program PGP is one of the most
- popularly deployed crypto tools in the on-line world, with PGP public keys
- rapidly becoming the electronic superhighway's equivalent of vanity plates.
-
- ANONYMOUS REMAILERS: These systems aim to conceal not the contents of a
- message but its source. A remailer is a network-connected computer that
- takes in e-mail, then sends it on to a destination specified in attached,
- encrypted instructions, thus placing a veil between sender and receiver. If
- the message is sent through a chain of even a few remailers, the veil
- quickly becomes rock solid, guaranteeing the sender's anonymity. Current
- usage: The Cypherpunks maintain a working anonymous remailer chain, but the
- most active are the one-hop systems used by participants in public on-line
- discussions of bondage, foot worship, and assorted other predilections they
- might not want their computer-literate boss/parents/neighbors to know
- about.
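-
- The layering is concrete enough to sketch in a few lines of Python,
- assuming the third-party cryptography package's Fernet recipe for
- the per-hop encryption (hop names and the message format here are
- hypothetical):
-
-     from cryptography.fernet import Fernet
-
-     hops = ["alpha", "beta", "gamma"]
-     keys = {h: Fernet.generate_key() for h in hops}
-
-     # The sender wraps the message in one layer per remailer.
-     packet = b"DELIVER-TO: alice :: meet at noon"
-     for h in reversed(hops):
-         packet = Fernet(keys[h]).encrypt(b"NEXT-HOP :: " + packet)
-
-     # Each remailer peels exactly one layer; no single hop can link
-     # the original sender to the final destination.
-     for h in hops:
-         packet = Fernet(keys[h]).decrypt(packet)
-         _, _, packet = packet.partition(b" :: ")
-     print(packet)   # b"DELIVER-TO: alice :: meet at noon"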
-
- DIGITAL SIGNATURES: In the fluid world of digital info, how do you verify
- that a message is really from whom it claims it's from? Turn public-key
- cryptography inside out, that's how. Have the sender encrypt the message
- with her private key, then let the receiver try to decrypt it with the
- sender's public key. If the decryption comes out clear, then the sender's
- identity is confirmed -- without revealing her private key or even, if the
- public key is attached to a pseudonymous but otherwise trustworthy on-line
- persona, her physical identity. This is more or less how digital signatures
- work. Current usage: mainly in corporate and bureaucratic settings, though
- all good Cypherpunks try to make a habit of e-signing their e-mail.
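-
- That inside-out trick fits on one screen with toy RSA parameters
- (insecure, illustration only; modular inverses via pow need Python
- 3.8 or later):
-
-     import hashlib
-
-     p, q, e = 61, 53, 17
-     n = p * q
-     d = pow(e, -1, (p - 1) * (q - 1))        # the private key
-
-     def digest(msg):
-         return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
-
-     msg = b"I really did write this."
-     sig = pow(digest(msg), d, n)             # sign with the private key
-     assert pow(sig, e, n) == digest(msg)     # verify with the public key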
-
- ELECTRONIC CASH: Imagine the convenience of credit cards combined with the
- anonymity of cash. Imagine a microchip-equipped debit card that instantly
- deducts transactions from the user's bank account, yet does so without
- revealing the payer's identity to the payee or linking payer and payee in
- the bank's records. Imagine these mechanisms set loose in the world's
- computer nets, converting great chunks of money supply into fast, loose,
- digital e-cash. The wizardry of public-key crypto can make all this happen
- and probably will. Current usage: experimental, mostly. Denmark, however, is
- gearing up to implement an encrypted smart-card system, based on the ideas
- of crypto-hacker David Chaum, who holds patents on most e-money
- applications.
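-
- The building block behind Chaum's untraceable payments is the blind
- signature, which can be sketched with the same toy RSA parameters
- (just the blinding arithmetic, not a full payment protocol):
-
-     import secrets
-     from math import gcd
-
-     p, q, e = 61, 53, 17
-     n = p * q
-     d = pow(e, -1, (p - 1) * (q - 1))        # the bank's private key
-
-     m = 1234                                 # the payer's "coin"
-     while True:
-         r = secrets.randbelow(n - 2) + 2     # blinding factor
-         if gcd(r, n) == 1:
-             break
-
-     blinded = (m * pow(r, e, n)) % n         # all the bank ever sees
-     signed = pow(blinded, d, n)              # blind signature: m^d * r mod n
-     sig = (signed * pow(r, -1, n)) % n       # payer unblinds
-     assert sig == pow(m, d, n)               # valid, yet never seen unblinded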
-
- --
-
- TALE FROM THE CRYPTO WARS
-
- The high weirdness of the military's code-busting censorship moves peaked in
- World War II, but didn't end there. It was during the Gulf War, in fact,
- that military censors made one of the strangest additions to their already
- strange list of banned communications: the Navajo language. A small number
- of Navajos, it seems, wanted to send broadcast greetings in their native
- tongue to loved ones stationed overseas, but Armed Forces Radio refused to
- pass the messages along. Once again, the mere possibility of enemy signals
- lurking in the noise was too much for the censors to bear. ``We have a
- responsibility to control what's on the radio,'' said the lieutenant colonel
- in charge, ``and if I don't know what it says then I can't control it.''
-
- In the ripest of ironies, however, it turns out that the only nation
- ever known to have used Navajo as a cover for secret communications was the
- United States itself. Throughout World War II's Pacific campaign, the Marine
- Corps made heavy and effective use of its Navajo codetalker units--teams of
- Navajo radiomen who spoke a slangy, cryptic patois difficult even for
- uninitiated Navajos to grasp, and ultimately impossible for the Japanese to
- decode. Today the codetalkers remain legendary figures on the rez and beyond
- -- legendary enough indeed that New Mexico congressman Bill Richardson,
- wielding the memory of their exploits, finally shamed Armed Forces Radio
- into lifting its ban and letting Navajo greetings reach the Gulf.
-
- It's a familiar story. Prized and feared for its impenetrable otherness,
- Navajo has met the same uneasy fate reserved for all true difference in a
- country that both prides itself on cultural diversity and insistently
- suppresses it. But in its blurring of the lines between language and secret
- code, Navajo's passage through the belly of the military beast hints at one
- way out of America's terminal cultural ambivalence. As arch-Cypherpunk John
- Gilmore has argued, committing to universally accessible encryption is one
- way for our society to finally take the ideal of diversity seriously --
- backing it up ``with physics and mathematics, not with laws,'' and certainly
- not with the lip service it's traditionally honored with. Cryptography could
- guarantee us each a language of our own, which no censor, military or
- otherwise, could hope to silence.
-
-
-
- --
- *********************************************************************
- Julian Dibbell julian@panix.com
- *********************************************************************
-
-
-
-
- From owner-cypherpunks@toad.com Thu Aug 5 22:11:35 1993
- id <AA11372>; Thu, 5 Aug 1993 22:11:23 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AB06870; Thu, 5 Aug 93 23:58:34 -0400
- Received: by toad.com id AA14703; Thu, 5 Aug 93 20:48:39 PDT
- Received: by toad.com id AA14656; Thu, 5 Aug 93 20:46:37 PDT
- Return-Path: <tcmay@netcom.com>
- Received: from netcom5.netcom.com ([192.100.81.113]) by toad.com id AA14652; Th
- Received: by netcom5.netcom.com (5.65/SMI-4.1/Netcom)
- id AA16297; Thu, 5 Aug 93 20:47:25 -0700
- From: tcmay@netcom.com (Timothy C. May)
- Message-Id: <9308060347.AA16297@netcom5.netcom.com>
- Subject: (fwd) Re: Will SKIPJACK's algorithm get out? (Non-technical)
- To: cypherpunks@toad.com
- Date: Thu, 5 Aug 93 20:47:25 PDT
- X-Mailer: ELM [version 2.3 PL11]
- Status: OR
-
- Here's a posting I did on how Skipjack (which I deliberately called
- "Clipjack") can be likely broken by groups like ours. The anonymous
- remailers, and the alt.whistleblowing group, can be used to publish
- details of the whole Skipjack/Capstone/Mykotronx/MYK-78/etc. ball of
- wax as they become available.
-
- Whether we can actually be the ones to analyze the chips or not is
- immaterial: spreading reports that Clipjack is vulnerable will be
- useful disinformation (reduced confidence, fewer commercial sales,
- more acceptance of more provably strong software-based alternatives,
- etc.)
-
- -Tim
-
-
- Newsgroups: sci.crypt,alt.privacy.clipper
- From: tcmay@netcom.com (Timothy C. May)
- Subject: Re: Will SKIPJACK's algorithm get out? (Non-technical)
- Message-ID: <tcmayCBBJCr.BsK@netcom.com>
- Date: Fri, 6 Aug 1993 03:36:27 GMT
-
-
- Larry Loen (lwloen@rchland.vnet.ibm.com) wrote:
-
- : Myself, I confidently expect to see Skipjack published in some Eurocrypt
- : proceedings or other in the next 4 or 5 years, especially if the darn thing
- : is actually produced in any volumes.  There is a decidedly
- : different attitude in W. Europe towards this sort of thing.
-
- : It's mostly a question of economics. Will someone, somewhere put out the
- : bucks to do a "tear down" of the chip and figure out how it works. I could
- : imagine some crypto company in Europe doing just that and being also
- : motivated to publish what they find for competitive reasons. . .
-
- Some of us plan to do just this: once "Clipjack" phones are finalized
- and on sale and/or Mykotronx is selling finalized chips, they'll be
- looked at.
-
- I once ran Intel's electron-beam testing lab, so I have some
- familiarity with looking at chips, including ostensibly
- tamper-resistant modules. VLSI Technology is fabbing the chips, using
- a process said to be quite tamper-resistant. We'll see. (While
- publishing the algorithm may or may not be illegal, there's no
- reasonable law saying you can't look at something, unless perhaps it's
- formally classified....will the Clipjack chips have "Top Secret"
- stamped on them? Somehow I can't quite picture this in phones sold
- across the country and outside!)
-
- (I'm not saying it'll be easy to do this reverse-engineering, mind
- you. Between mechanical barriers to access (carbide-like particles in
- the packaging compound to deter grinding), complex-chemistry epoxies
- to deter plasma- and chemical-decapping, various chip-level
- countermeasures (storing bits on floating gates, using multiple layers
- of metal, etc.), the access to the die surface may be very difficult.
- The "smartcard" chip makers have led the way in devising
- tamper-resistant chip processes, though their task is quite a bit
- easier (stopping access to an active chip on an active smartcard, to
- modify the money amounts) than Clipjack faces (stopping any
- examination of the chip topology and programming which would reveal
- the algorithms used)
-
- But given enough samples, enough time, and some
- commitment, the secrets of Clipjack will fall.)
-
- As a "Cypherpunk" (cf. cover of "Wired" #2, "Whole Earth Review" Summer '93,
- and the current (8-2-93) "Village Voice" cover story), I see no reason
- not to publish the details. This'll let other folks build phones and other
- comm systems which spoof or defeat the Clipjack system, especially the
- disgusting and thoroughly un-American "key escrow" system.
-
- Naturally, we'll use our "anonymous remailers" (multiple reroutings of
- messages, with each node decrypting with its key and passing on what's
- left to the next chosen node....diffusion and confusion, a la Chaum's
- 1981 "CACM" paper on "digital mixes") to protect ourselves. No sense
- taking chances that the Feds will view our "liberation" efforts with
- disfavor and hit us with charges they devise (violations of Munitions
- Act, RICO, sedition, etc.). This is how some of our members were able
- to "liberate" secret Mykotoxin documents from the dumpsters of
- Mykotoxin (something the Supremes have said is OK for law enforcement
- to do, by the way) and post them anonymously to our mailing list (I
- believe these docs were then posted to alt.whistleblowers, but they were
- only _mentioned_ on sci.crypt, not actually posted).
-
- I expect at least _three_ separate groups are preparing to break the
- Clipjack algorithm, at least as embodied in the Clipper/Skipjack chips
- that come on the market.
-
- Breaking the system also allows independent observers to see if it
- does in fact contain deliberate weaknesses (though the focus on
- "weaknesses" is secondary to the basic issue of "key escrow" as a
- concept--it is key escrow, especially mandatory key escrow, that is
- the real issue). Mandatory key escrow is not yet part of law, to be
- fair, but still "in the wind"...we won't really know for a few more
- years whether the "voluntary" key escrow system will become mandatory.
-
- It'll also be interesting to see how Clipjack phone customers react to
- the revelations of the algorithms.
-
- Crypto anarchy means never having to say you're sorry.
-
-
- Yours in the struggle,
-
- -Tim May
- --
- ..........................................................................
- Timothy C. May | Crypto Anarchy: encryption, digital money,
- tcmay@netcom.com | anonymous networks, digital pseudonyms, zero
- 408-688-5409 | knowledge, reputations, information markets,
- W.A.S.T.E.: Aptos, CA | black markets, collapse of governments.
- Higher Power: 2^756839 | Public Key: PGP and MailSafe available.
- Note: I put time and money into writing this posting. I hope you enjoy it.
-
-
-
- From cypherpunks-request@toad.com Tue May 25 13:02:18 1993
- id <AA03727>; Tue, 25 May 1993 13:02:14 -0600
- Received: by uucp-gw-2.pa.dec.com; id AA00801; Tue, 25 May 93 11:57:37 -0700
- Received: by toad.com id AA22998; Tue, 25 May 93 11:46:18 PDT
- Return-Path: <sytex!sytex.com!fergp@uunet.UU.NET>
- Received: from relay2.UU.NET by toad.com id AA22994; Tue, 25 May 93 11:46:08 PD
- Received: from spool.uu.net (via LOCALHOST) by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA02788; Tue, 25 May 93 14:46:03 -0400
- Received: from sytex.UUCP by spool.uu.net with UUCP/RMAIL
- (queueing-rmail) id 144431.19711; Tue, 25 May 1993 14:44:31 EDT
- Received: by sytex.com (Smail3.1.28.1 #1)
- id m0ny3YT-00019OC; Tue, 25 May 93 14:16 EDT
- To: cypherpunks@toad.com
- Subject: Bill O' Rights
- From: fergp@sytex.com (Paul Ferguson)
- Message-Id: <J6a54B1w165w@sytex.com>
- Date: Tue, 25 May 93 14:13:42 EDT
- Organization: Sytex Communications, Inc
- Status: OR
-
- I remember reading this in the March ACM and thinking, "Man. He hit
- that right on the head." When I ran across this transcript in Computer
- Select earlier this morning (while looking for various encryption
- products, no less), I thought those of you who had not already seen it
- would be struck by John Perry's insights. BTW, I also have the full
- transcripts of Dorothy Denning's, William A. Bayse's (Assistant
- Director, FBI Technical Services Division) and Lewis M. Branscomb's
- (Harvard University) articles which appeared in the same issue with
- regard to Digital Telephony, if anyone cares for me to post them.
- Looking back on the progression of events, beginning with the debate
- of the Digital Telephony proposal and subsequently the proposal
- currently (officially) referred to as the "Key Escrow" Chip (and its
- associated escrow scheme), I can't help but surmise that the whole
- ball of wax is geared towards allowing the Government the ability to
- effectively eavesdrop on its citizens' communications in the face of
- advancing technology, without regard to privacy matters.
-
- 8<---- Begin forwarded text ---------
-
- Journal: Communications of the ACM March 1993 v36 n3 p21(3)
- * Full Text COPYRIGHT Association for Computing Machinery Inc.1993.
- ----------------------------------------------------------------------
- Title: Bill o' rights. (impact of technology on basic civil rights;
- humor) (Electronic Frontier)
- Author: Barlow, John Perry
-
- ----------------------------------------------------------------------
- Full Text:
-
- *Note* Only Text is presented here; see printed issues for graphics.
-
- It has been almost three years since I first heard of the Secret Service
- raids on Steve Jackson Games and the cyberurchins from the Legion of
- Doom. These federal exploits, recently chronicled in Bruce Sterling's
- book Hacker Crackdown, precipitated the formation of the Electronic
- Frontier Foundation and kicked loose an international digital liberties
- movement which is still growing by leaps and conferences.
-
- I am greatly encouraged by the heightened awareness among the citizens
- of the Global Net of our rights, responsibilities, and opportunities.
- I am also heartened that so many good minds now tug at the legal,
- ethical, and social riddles which come from digitizing every damned
- thing. The social contract of Cyberspace is being developed with
- astonishing rapidity, considering that we are still deaf, dumb, and
- disembodied in here.
-
- Meanwhile, back in the Physical World, I continue to be haunted by the
- words of the first lawyer I called on behalf of Steve Jackson, Phiber
- Optik, and Acid Phreak back in the spring of 1990. This was Eric
- Lieberman of the prestigious New York civil liberties firm Rabinowitz,
- Boudin, Standard, Krinsky, and Lieberman. I told him how the Secret
- Service had descended on my acquaintances and taken every scrap of
- circuitry or magnetized oxide they could find. This had included not
- only computers and disks, but clock radios and audio cassettes.
-
- I told him that, because no charges had been filed, the government was
- providing their targets no legal opportunity to recoup their confiscated
- equipment and data. (In fact, most of the victims of Operation Sun
- Devil still have neither been charged nor had their property returned to
- them.)
-
- [This issue has been somewhat resolved with the recent ruling in
- favor of Steve Jackson and the subsequent award of damages.]
-
- The searches were anything but surgical and the seizures appeared
- directed less at gathering evidence than inflicting punishment without
- the bothersome formality of a trial. I asked Lieberman if the Secret
- Service might not be violating the Fourth Amendment's assurance of "The
- right of the people to be secure in their persons, houses, papers, and
- effects, against unreasonable searches and seizures."
-
- He laughed bitterly. "I think if you take a look at case law for the
- last ten years or so, you will find that the Fourth Amendment has
- pretty much gone away," he said.
-
- I did. He was right. A lot of what remained of it was flushed a year
- later when the Rehnquist Court declared that in the presence of
- "probable cause" ...a phrase of inviting openness...law enforcement
- officials could search first and obtain warrants later.
-
- Furthermore, I learned that through such sweeping prosecutorial
- enablements as RICO and Zero Tolerance, the authorities could exact
- their own unadjudicated administrative "fines" by keeping much of what
- they seized for their own uses.
-
- (This incentive often leads to disproportionalities between "punishment"
- and "crime" which even Kafka might have found a bit over the top. I
- know of one case in which the DEA acquired a $14 million Gulfstream
- bizjet from a charter operator because one of its clients left half a
- gram of cocaine in its washroom.)
-
- I tried to imagine a kind of interactive Bill of Rights in which
- amendments would fade to invisibility as they became meaningless, but
- I knew that was hardly necessary. The citizens of Stalin's Soviet
- Union had a constitutional guarantee of free expression which
- obviously, like our own, allowed some room for judicial
- interpretation.
-
- It occurred to me then that a more honest approach might be to maintain
- a concordant Bill of Rights, running in real time and providing
- up-to-the-minute weather reports from the federal bench, but I never got
- around to it.
-
- Recently I started thinking about it again. These thoughts were
- inspired partly by Dorothy Denning's apology for the FBI's digital
- telephony proposal (which appears in this issue). I found her analysis
- surprisingly persuasive, but I also found it fundamentally based on an
- assumption I no longer share: the ability of the Bill of Rights to
- restrain government, now or in the future.
-
- The men who drafted the U.S. Constitution and its first ten amendments
- knew something that we have largely forgotten: Governments exist to limit
- freedom. That's their job. And to the extent that utterly unbridled
- liberty seems to favor the reptile in us, a little government is not
- such a bad thing. But it never knows when to quit. As there is no
- limit to either human imagination or creativity in the wicked service
- of the Self, so it is always easy for our official protectors to envision
- new atrocities to prevent.
-
- Knowing this, James Madison and company designed a government which was
- slightly broken up front. They intentionally created a few wrenches to
- cast into the works, and these impediments to smooth governmental
- operation were the Bill of Rights.
-
- Lately though, we find ourselves living in a world where the dangers we
- perceive are creatures of information rather than experience. Since
- the devil one knows is always less fearsome than the worst one can
- imagine, there is no limit to how terrifying or potent these dangers can
- seem.
-
- Very few of us, if any, have ever felt the malign presence of a real,
- live terrorist or drug lord or Mafia capo or dark-side hacker. They
- are projected into our consciousness by the media and the government,
- both of which profit directly from our fear of them. These enemies are,
- in our (tele)visions of them, entirely lacking in human decency or
- conscience. There is no reason they should be mollycoddled with
- constitutional rights.
-
- And so, we have become increasingly willing to extend to government what
- the Founding Fathers would not: real efficiency. The courts have been
- updating the Bill of Rights to fit modern times and perils, without
- anyone having to go through the cumbersome procedure of formal amendment.
-
- The result, I would suggest with only a little sarcasm or hyperbole, has
- come to look something like this:
-
- Bill O' Rights
-
- AMENDMENT 1
-
- Congress shall encourage the practice of Judeo-Christian religion by its
- own public exercise thereof and shall make no laws abridging the freedom
- of responsible speech, unless such speech is in a digitized form or
- contains material which is copyrighted, classified, proprietary, or
- deeply offensive to non-Europeans, nonmales, differently abled or
- alternatively preferenced persons; or the right of the people peaceably
- to assemble, unless such assembly is taking place on corporate or
- military property or within an electronic environment, or to make
- petitions to the government for a redress of grievances, unless those
- grievances relate to national security.
-
- AMENDMENT 2
-
- A well-regulated militia having become irrelevant to the security of the
- state, the right of the people to keep and bear arms against one another
- shall nevertheless remain uninfringed, excepting such arms as may be
- afforded by the poor or those preferred by drug pushers, terrorists, and
- organized criminals, which shall be banned.
-
- AMENDMENT 3
-
- No soldier shall, in time of peace, be quartered in any house, without
- the consent of the owner, unless that house is thought to have been used
- for the distribution of illegal substances.
-
- AMENDMENT 4
-
- The right of the people to be secure in their persons, houses, papers and
- effects against unreasonable searches and seizures, may be suspended to
- protect public welfare, and upon the unsupported suspicion of law
- enforcement officials, any place or conveyance shall be subject to
- immediate search, and any such places or conveyances or property within
- them may be permanently confiscated without further judicial proceeding.
-
- AMENDMENT 5
-
- Any person may be held to answer for a capital, or otherwise infamous
- crime involving illicit substances, terrorism, or child pornography, or
- upon any suspicion whatever; and may be subject for the same offense to
- be twice put in jeopardy of life or limb, once by the state courts and
- again by the federal judiciary; and may be compelled by various means,
- including the forced submission of breath samples, bodily fluids, or
- encryption keys, to be a witness against himself, refusal to do so
- constituting an admission of guilt; and may be deprived of life, liberty,
- or property without further legal delay; and any property thereby
- forfeited shall be dedicated to the discretionary use of law enforcement
- agencies.
-
- AMENDMENT 6
-
- In all criminal prosecutions, the accused shall enjoy the right to a
- speedy and private plea bargaining session before pleading guilty. He is
- entitled to the assistance of underpaid and indifferent counsel to
- negotiate his sentence, except where such sentence falls under federal
- mandatory sentencing requirements.
-
- AMENDMENT 7
-
- In suits at common law, where the contesting parties have nearly
- unlimited resources to spend on legal fees, the right of trial by jury
- shall be preserved.
-
- AMENDMENT 8
-
- Sufficient bail may be required to ensure that dangerous criminals will
- remain in custody, where cruel punishments are usually inflicted.
-
- AMENDMENT 9
-
- The enumeration in the Constitution of certain rights, shall not be
- construed to deny or disparage others which may be asserted by the
- government as required to preserve public order, family values, or
- national security.
-
- AMENDMENT 10
-
- The powers not delegated to the U.S. by the Constitution, shall be
- reserved to the U.S. Departments of Justice and Treasury, except when
- the states are willing to forsake federal funding.
-
- [John P. Barlow is a technological author and the cofounder (with Mitch
- Kapor) of the Electronic Frontier Foundation. He currently lives in
- Wyoming, New York and "in Cyberspace." His email address is
- barlow@eff.org.]
-
- Paul Ferguson | The future is now.
- Network Integrator | History will tell the tale;
- Centreville, Virginia USA | We must endure and struggle
- fergp@sytex.com | to shape it.
-
- Stop the Wiretap (Clipper/Capstone) Chip.
-
-
-
- From owner-cypherpunks@toad.com Sun Aug 1 20:04:57 1993
- id <AA03157>; Sun, 1 Aug 1993 20:04:54 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA08338; Sun, 1 Aug 93 21:54:43 -0400
- Received: by toad.com id AA27620; Sun, 1 Aug 93 18:47:25 PDT
- Received: by toad.com id AA27616; Sun, 1 Aug 93 18:47:16 PDT
- Return-Path: <sytex!sytex.com!fergp@uunet.UU.NET>
- Received: from relay2.UU.NET by toad.com id AA27612; Sun, 1 Aug 93 18:47:06 PDT
- Received: from spool.uu.net (via LOCALHOST) by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA07276; Sun, 1 Aug 93 21:47:03 -0400
- Received: from sytex.UUCP by uucp1.uu.net with UUCP/RMAIL
- (queueing-rmail) id 214530.12641; Sun, 1 Aug 1993 21:45:30 EDT
- Received: by sytex.com (Smail3.1.28.1 #1)
- id m0oMo6b-0000wjC; Sun, 1 Aug 93 20:50 EDT
- To: cypherpunks@toad.com
- Subject: NSA: The Eyes of Big Brother
- From: fergp@sytex.com (Paul Ferguson)
- Message-Id: <JLqm8B2w165w@sytex.com>
- Date: Sun, 01 Aug 93 20:44:54 EDT
- Organization: Sytex Communications, Inc
- Status: OR
-
- reprinted without permission from Claustrophobia:
-
- Claustrophobia
- August 1993
- Volume 2, Number 7
-
-
-
- NSA: The Eyes of Big Brother
- by Charles Dupree
-
- -----------------------------------------------------------
-
- The history of the National Security Agency (NSA) presented
- here includes and depends on information reported in three books.
- The vast majority of data on the National Security Agency comes
- from James Bamford's book The Puzzle Palace [1982]; all
- quotations are taken from Bamford unless otherwise noted. As Tim
- Weiner says, this book is "The best -- the only -- history of the
- NSA." Material about NSA's secret funding comes entirely from
- Weiner's Blank Check [1990], which also provided budget estimates
- and supporting material for other sections. The CIA and the Cult
- of Intelligence by Victor Marchetti and John D. Marks [1980
- edition, originally published 1974], provided background
- information and a glimpse of the NSA from within the intelligence
- community but outside the agency itself.
- --------------------------------------------------------------
-
- The oppressive atmosphere of Orwell's 1984 arises from the
- omnipresence of Big Brother, the symbol of the government's
- concern for the individual. Big Brother controls the language,
- outlawing words he dislikes and creating new words for his
- favorite concepts. He can see and hear nearly everything, public
- or private. Thus he enforces a rigid code of speech and action
- that erodes the potential for resistance and reduces the need for
- force. As Noam Chomsky says, propaganda is to democracy what
- violence is to totalitarianism. Control thoughts, and you can
- easily control behavior.
-
- U.S. history affords a prime example in the era named after
- Senator Joseph McCarthy, though he had many supporters in his
- attack on freedom of thought and speech. Perhaps his most
- powerful friend was J. Edgar Hoover, who fed him material from
- Federal Bureau of Investigation (FBI) files (some of it true)
- which he used to attack individuals for their supposed political
- leanings. By the time of Watergate, the Central Intelligence
- Agency (CIA) had become at least as notorious as the FBI, due
- largely to its assassinations of foreign leaders and support for
- military coups around the world.
-
- The Creation of the NSA
-
- Budgetary authority for the National Security Agency (NSA)
- apparently comes from the Central Intelligence Act of 1949. This
- act provides the basis for the secret spending program known as
- the black budget by allowing any arm of the government to
- transfer money to the CIA "without regard to any provisions of
- the law," and allowing the CIA to spend its funds as it sees fit,
- with no need to account for them.
-
- Congress passed the C.I.A. Act despite the fact that only the
- ranking members of the Senate and House Armed Services Committees
- knew anything about its contents; the remaining members of
- Congress were told that open discussion, or even clear
- explanation, of the bill would be counterproductive. There were
- complaints about the secrecy; but in the end the bill passed the
- House by a vote of 348-4, and the Senate by a majority voice
- vote.
-
- The NSA's estimated $10 billion annual allocation (as of 1990) is
- funded entirely through the black budget. Thus Congress
- appropriates funds for the NSA not only without information on
- the agency's plans, but without even a clear idea of the amount
- it appropriates; and it receives no accounting of the uses to
- which the funds were put. This naturally precludes any debate
- about the direction or management of such agencies, effectively
- avoiding public oversight while spending public funds. (Weiner
- notes the analogy to "Taxation without representation.")
-
- Watching and Listening
-
- "The NSA has also spent a great deal of time and money spying on
- American citizens. For 21 years after its inception it tracked
- every telegram and telex in and out of the United States, and
- monitored the telephone conversations of the politically
- suspect." (Weiner, Blank Check)
-
- Due to its unique ability to monitor communications within the
- U.S. without a warrant, which the FBI and CIA cannot legally do,
- NSA becomes the center of attempts to spy on U.S. citizens.
- Nominally this involves only communications in which at least one
- terminal is outside the U.S., but in practice target lists have
- often grown to include communications between U.S. citizens
- within the country. And political considerations have sometimes
- become important.
-
- During the Nixon administration, for example, various agencies
- (e.g., FBI, CIA, Secret Service) requested that the NSA provide
- all information it encountered showing that foreign governments
- were attempting to influence or control the activities of U.S.
- anti-war groups, as well as information on civil rights, draft
- resistance/evasion support groups, radical-related media
- activities, and so on, "where such individuals have some foreign
- connection," probably not that uncommon given the reception such
- groups usually receive at home. Clearly it would have been
- illegal for those agencies to gather such information themselves
- without warrants, but they presumably believed that the NSA was
- not similarly restricted when they included on their watch lists
- such Nixonian bugaboos as Eldridge Cleaver, Abbie Hoffman,
- Jane Fonda, Joan Baez, Dr. Benjamin Spock, and the Rev. Ralph
- Abernathy. Presumably the name of Dr. Martin Luther King, Jr.,
- was removed from the list the year Nixon was elected; certainly
- it was a targeted name before that time.
-
- It is not feasible to determine in advance which telegrams and
- telephone calls will be among those the NSA is tasked with
- intercepting. Therefore, the NSA is normally reduced to recording
- all traffic on lines it is monitoring, and screening this traffic
- (by computer when possible) to catch targeted communications.
- This is called the "vacuum-cleaner approach."
-
- Also basic to this method is the "watch list" of groups and
- individuals whose communications should be "targeted." When a
- target is added to the watch list, NSA's computers are told to
- extract communications to, from, or about the target; the agency
- can then examine the selected communications and determine
- whether they constitute intelligence data.
-
- This list of targets usually expands to include all members of
- targeted groups plus individuals and groups with whom they
- communicate; thus it has a tendency to grow rapidly if not
- checked. Some requests seem a bit astonishing: during the
- presidency of Richard Nixon (himself a Quaker), J. Edgar Hoover requested
- "complete surveillance of all Quakers in the United States"
- because he thought they were shipping food and supplies to
- Southeast Asia.
-
- Project Shamrock
-
- Project Shamrock was initiated in 1945 by the Signal Security
- Agency (SSA), which eventually merged into the NSA. Until the
- project was terminated in 1975 to prevent investigation, Shamrock
- involved NSA (and its predecessors) in communications collection
- activities that would be illegal for agencies such as the CIA or
- FBI.
-
- Under Shamrock, the international branches of RCA, ITT, and
- Western Union provided access by SSA, and its successor NSA, to
- certain telegrams sent by those companies. Each company's counsel
- recommended against involvement on legal grounds; each company
- requested the written opinion of the Attorney General that it was
- not making itself liable to legal action. However, none of them
- received anything in writing from anyone in the government, and
- they all cooperated without it. (They did get a verbal assurance
- from the first Secretary of Defense, James Forrestal, who said he
- was speaking for the President; thus they may have been concerned
- at his resignation just over a year later, his hospitalization
- within a week, suffering from depression, anxiety, and paranoia,
- and his suicide less than two months later.)
-
- As Shamrock grew, and the NSA began to develop its own means of
- intercepting communications, the watch list approach became the
- accepted standard, since nothing less was effective or
- worthwhile. The intelligence community became aware that it could
- enter a name on the watch list more or less at will, and it would
- soon receive the requested material, marked classified, and
- gathered within (or perhaps under cover of) the law.
-
- The Huston Plan
-
- The Huston Plan, formally known as "Domestic Intelligence
- Gathering Plan: Analysis and Strategy," was submitted in July
- 1970 to President Nixon. The goal of the plan was to relax some
- restrictions on intelligence gathering, apparently those of NSCID
- No. 6. Some parts of the intelligence community felt that these
- relaxations would assist their efforts. The proposals included:
-
- o allowing the NSA to monitor "communications of U.S. citizens
- using international facilities" (presumably facilities located in
- the U.S., since the NSA already had authority to monitor such
- communications if at least one terminal was outside U.S.
- territory)
-
- o intensifying "coverage of individuals and groups in the United
- States who pose a major threat to the internal security"
-
- o modifying restrictions "to permit selective use of
- [surreptitious entry] against other urgent and high priority
- internal security targets" as well as to procure "vitally needed
- foreign cryptographic material," which would have required the
- FBI to accept warrantless requests for such entries from other
- agencies ("Use of this technique is clearly illegal: it amounts
- to burglary. It is also highly risky and could result in great
- embarrassment if exposed. However, it is also the most fruitful
- tool and can produce the type of intelligence which cannot be
- obtained in any other fashion.")
-
- President Nixon approved this plan over the objection of J. Edgar
- Hoover and without the knowledge of Attorney General Mitchell.
- Hoover went to Mitchell, who had been left out of the entire
- process, and was consequently angry; Mitchell convinced Nixon to
- withdraw his approval 13 days after giving it.
-
- Project Minaret
-
- The size and complexity of the domestic watch list program became
- a problem, since it bordered on illegality. Project Minaret was
- established on July 1, 1969, to "provid[e] more restrictive
- control" on the domestic products, and "to restrict the knowledge
- that information is being collected and processed by the National
- Security Agency." The agency knew it was close to legal
- boundaries, and wanted to protect itself.
-
- Minaret continued until the fall of 1973, when Attorney General
- Richardson became aware of the domestic watch list program and
- ordered such activities stopped. As the Watergate drama played
- out, Congress began to hear about the NSA's projects, and within
- two years was formally inquiring about them.
-
- Uncontrolled Activities
-
- Like most intelligence agencies, the NSA uses words such as
- "interrupt" and "target" in a technical sense with a precise but
- often classified definition. This specialized language makes it
- difficult to legislate or oversee the activities involved. For
- instance, in NSA terms a conversation that is captured, decoded
- if necessary, and distributed to the requesting agency is not
- considered to be the product of eavesdropping unless one of the
- parties to the conversation is explicitly targeted. However, the
- NSA does not depend on semantic defences; it can also produce
- some legal arguments for exempting itself from normal
- requirements.
-
- On the rare occasions when NSA officials have to testify before
- Congress, they have claimed a mandate broad enough to require a
- special legal situation. In 1975, the NSA found its activities
- under scrutiny by the Senate Intelligence Committee, chaired by
- Frank Church; the House Select Committee on the Intelligence
- Community, under Otis Pike; and the House Government Operations
- Subcommittee on Government Information and Individual Rights, led
- by Bella Abzug. The agency was notably consistent in responding
- to those committees.
-
- When Lt. Gen. Lew Allen appeared before the Pike committee, he
- pointed out that it was the first time an NSA director had been
- required to testify in open session. Two days earlier, CIA
- director William Colby had testified that the NSA was not always
- able to separate the calls of U.S. citizens from the traffic it
- monitors. The general counsel of the NSA, Roy Banner, joined
- Allen as a witness. He was asked if, in his opinion, the NSA could
- legally intercept overseas telephone calls from U.S. citizens
- despite the legal prohibition on wiretapping. He replied, "That
- is correct."
-
- The top three officers of the NSA spoke with a single voice to
- the Church committee. The committee's chief counsel said to
- Allen, "You believe you are consistent with the statutes, but
- there is not any statute that prohibits your interception of
- domestic communications." When deputy director Buffham was asked
- about the legality of domestic aspects of the Huston plan, he
- said, "Legality? That particular aspect didn't enter into the
- discussions." Counsel Banner responded at least three times to
- similar questions that the program had been legal at the time.
- (Testimony took place on Oct. 29, 1975; Project Shamrock and its
- watch lists were halted in mid-May of that year.)
-
- The Abzug committee tried to get the story from the
- communications corporations that had cooperated in Project
- Shamrock. Its hearings in late 1975 were unproductive because RCA
- and ITT informed the committee, two days before hearings began,
- that their executives would not appear without a subpoena; and a
- former FBI agent who had been cooperating was forbidden by his
- old employer from testifying. When the committee reconvened in
- early 1976, it issued subpoenas to three FBI special agents, plus
- one former agent; one NSA employee; and executives from
- international arms of RCA, ITT, and Western Union. President Ford
- prevented the five FBI/NSA people from testifying with a claim of
- executive privilege, and the Attorney General requested that the
- corporations refuse to comply with the subpoenas on the same
- grounds. Their testimony in spite of that request brought Project
- Shamrock to light less than a year after it was quickly
- terminated.
-
- There may have been some legal basis for the NSA claims of
- extra-legal status. Despite having no statutory basis or
- charter, the NSA has considerable statutory protection: various
- statutes, such as the COMINT statute, 18 U.S.C. 798; Public Law
- 86-36; and special provisions of the 1968 Omnibus Crime Control
- and Safe Streets Act, exempt it from normal scrutiny, even from
- within the government. Thus the agency may be right in
- interpreting the law to say that it can do anything not
- specifically prohibited by the President or the National Security
- Council.
-
- NSCID No. 6, NSA's secret charter, includes this important
- exemption (according to James Bamford's reconstruction):
-
- "The special nature of Communications Intelligence activities
- requires that they be treated in all respects as being outside
- the framework of other or general intelligence activities.
- Orders, directives, policies, or recommendations of any
- authority of the Executive branch relating to the collection
- ... of intelligence shall not be applicable to Communications
- Intelligence activities, unless specifically so stated and
- issued by competent departmental or agency authority
- represented on the [U.S. Communications Intelligence] Board.
- Other National Security Council Intelligence Directives to the
- Director of Central Intelligence and related implementing
- directives issued by the Director of Central Intelligence shall
- be construed as non-applicable to Communications Intelligence
- unless the National Security Council has made its directive
- specifically applicable to COMINT."
-
- The unchecked ability to intercept and read communications,
- including those of U.S. citizens within the country, would be
- dangerous even if carefully regulated by elected officials held
- to a public accounting.
-
- When the method is available to officials whose names are often
- unknown even to Congress, and who work for unaccountable agencies
- like the NSA, it is very difficult for the intelligence
- community, the defense community, and the Executive to refrain
- from taking advantage of such easily obtained knowledge.
-
- The lack of any effective oversight of the NSA makes it possible
- for the agency to initiate or expand operations without
- authorization from higher (or even other) authority. Periodic
- meetings of members of the intelligence community do not
- constitute true oversight or public control of government; and
- the same is true of the provision of secret briefings to a small
- number of senior members of the Congress, all chosen by the
- intelligence community and sworn to secrecy.
-
- Oversight of such extensive communications capability is
- important enough; but NSA's capabilities are not necessarily
- limited to intercepting and decrypting communications. The NSA
- can also issue direct commands to military units involved in
- Signals Intelligence (SIGINT) operations, bypassing even the
- Joint Chiefs of Staff. Such orders are subject only to appeal to
- the Secretary of Defense, and provide the NSA with capabilities
- with which it could conceivably become involved in operations
- beyond the collection of intelligence. At least, it does not seem
- to be legally restrained from doing so.
-
- It appears that the only effective restraint on the NSA is the
- direct authority of the President, the National Security Council
- (NSC), the Secretary of Defense, and the U.S. Intelligence Board.
- Since the agency was created and chartered in secret by the
- President and the NSC, it can presumably be modified in secret by
- the same authorities.
-
- Nor is the NSA bereft of means of influencing other branches of
- government, as Marchetti and Marks note:
-
- "A side effect of the NSA's programs to intercept diplomatic
- and commercial messages is that rather frequently certain
- information is acquired about American citizens, including
- members of Congress and other federal officials, which can be
- highly embarrassing to those individuals. This type of
- intercept message is handled with even greater care than the
- NSA's normal product, which itself is so highly classified a
- special security clearance is needed to see it."
-
-
- Complete control over a secret agency with at least 60,000 direct
- employees, a $10 billion budget, direct command of some military
- units, and the ability to read all communications would be an
- enormous weapon with which to maintain tyranny were it to arise.
- A President with a Napoleonic or Stalinist delusion would find
- in the NSA the perfect tool for the constant supervision of the
- individual by the state; not unlike scenarios depicted in novels
- such as Orwell's 1984.
-
- Senator Schweiker of the Church committee asked NSA director Allen
- if it were possible to use NSA's capabilities "to monitor
- domestic conversations within the United States if some person
- with malintent desired to do it," and was probably not surprised
- by Allen's "I suppose that such a thing is technically possible."
- Certainly Senator Church feared the possibility:
-
- "That capability at any time could be turned around on the
- American people and no American would have any privacy left,
- such is the capability to monitor everything: telephone
- conversations, telegrams, it doesn't matter. There would be no
- place to hide. If this government ever became a tyranny, if a
- dictator ever took charge in this country, the technological
- capacity that the intelligence community has given the
- government could enable it to impose total tyranny, and there
- would be no way to fight back, because the most careful effort
- to combine together in resistance to the government, no matter
- how privately it was done, is within the reach of the
- government to know. Such is the capability of this technology
- ...
-
- I don't want to see this country ever go across the bridge. I
- know the capability that is there to make tyranny total in
- America, and we must see to it that this agency and all agencies
- that possess this technology operate within the law and under
- proper supervision, so that we never cross over that abyss.
- That is the abyss from which there is no return..."
-
-
- [This concludes part one of our two-part series on the National
- Security Agency. Read part 2, "The NSA and the Clipper
- Initiative," in next month's Claustrophobia.]
-
- --------------------------------------------------------------
-
- Charles Dupree writes user documentation for a Silicon Valley
- software company. In recent years he has become concerned at the
- intrusive power of the National Security Agency; but this is
- probably just the effect of his antisocial habit of reading.
-
-
- 8<------ Snip, snip ---------
-
- For more information on Claustrophobia, contact Dena Bruedigam at
- dbruedig@magnus.acs.ohio-state.edu
-
- Paul Ferguson | "Government, even in its best state,
- Network Integrator | is but a necessary evil; in its worst
- Centreville, Virginia USA | state, an intolerable one."
- fergp@sytex.com | - Thomas Paine, Common Sense
-
- I love my country, but I fear its government.
-
-
-
-
- From owner-cypherpunks@toad.com Thu Aug 5 08:03:18 1993
- id <AA22321>; Thu, 5 Aug 1993 08:03:15 -0600
- Received: from toad.com by relay2.UU.NET with SMTP
- (5.61/UUNET-internet-primary) id AA28023; Thu, 5 Aug 93 09:49:56 -0400
- Received: by toad.com id AA21266; Thu, 5 Aug 93 06:39:51 PDT
- Received: by toad.com id AA21232; Thu, 5 Aug 93 06:38:38 PDT
- Return-Path: <pcw@access.digex.net>
- Received: from access.digex.net ([164.109.10.3]) by toad.com id AA21226; Thu, 5
- Received: by access.digex.net id AA26748
- (5.65c/IDA-1.4.4 for cypherpunks@toad.com); Thu, 5 Aug 1993 09:38:21 -0400
- Date: Thu, 5 Aug 1993 09:38:21 -0400
- From: Peter Wayner <pcw@access.digex.net>
- Message-Id: <199308051338.AA26748@access.digex.net>
- To: cypherpunks@toad.com
- Status: OR
-
-
-
- The Rule of Law and the Clipper Escrow Project
-
- Last Thursday, I attended the first day of the Computer System Security
- and Privacy Advisory Board in Washington. This is a group of industry
- experts who discuss topics in computer security that should affect the
- public and industry. Some of the members are from users like banks and
- others are from service providing companies like Trusted Information
- Services. Lately, their discussion has centered on the NSA/NIST's
- Clipper/Capstone/Skipjack project and the effects it will have on
- society.
-
- At the last meeting, the public was invited to make comments and they
- were almost unanimously skeptical and critical. They ranged from
- political objections to the purely practical impediments. Some argued
- that this process of requiring the government to have the key to all
- conversations was a violation of the Fourth Amendment of the
- Constitution prohibiting warrantless searches. Others noted that a
- software solution was much simpler and cheaper even if the chips were
- going to cost a moderate $25. There were many different objections,
- but practically everyone felt that a standard security system was
- preferable.
-
- This meeting was largely devoted to the rebuttals from the
- government. The National Security Agency, the Department of
- Justice, the FBI, the national association of District Attorneys
- and Sheriffs, and several others were all testifying today.
-
- The board itself runs with a quasi-legal style; they make a point of
- making both video and audio tapes of the presentations. The entire
- discussion is conducted with almost as much gravity as Congressional
- hearings. The whole meeting was suffused with an air of earnest
- lawfulness that emanated from these speakers. All of them came from the upper
- ranks of the military or legal system and a person doesn't rise to
- such a position without adopting the careful air of the very diligent
- bureaucrat. People were fond of saying things like, "Oh, it's in the
- Federal Register. You can look it up." This is standard operating
- procedure in Washington agencies and second nature to many of the
- day's speakers.
-
- Dorothy Denning was one of the first speakers and she reported on
- the findings of the committee of five noted public cryptologists
- who agreed to give the Clipper standard a once-over. Eleven people
- were asked, but six declined for a variety of reasons. The review
- was to be classified "Secret" and some balked at this condition
- because they felt it would compromise their position in public.
-
- The talk made clear that the government intended to keep the
- standard secret for the sole purpose of preventing people from
- making unauthorized implementations without the law enforcement
- back door. Dr. Denning said that everyone at the NSA believes
- that the algorithm could withstand public knowledge with no trouble.
- The review by the panel revealed no reason why they shouldn't trust
- this assessment.
-
- Although lack of time led the panel largely to rubberstamp
- the more extensive review by the NSA, they did conduct a few tests
- of their own. They programmed the algorithm on a Cray YMP, which
- incidentally could process 89,000 encryptions per second in
- single-processor mode. This implementation was used for a cycling
- test whose results seemed to imply good randomness. The test is done
- by repeatedly encrypting one value of data until a cycle occurs. The
- results agreed with what a random process should generate.
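-
- (To make the idea concrete, here is a toy cycling test in Python.
- The classified cipher is replaced by a truncated hash--a random
- mapping, so the walk develops a "tail" plus a cycle--and the block
- is shrunk to 24 bits so the cycle shows up quickly; everything here
- is illustrative, nothing is Skipjack:
-
-     import hashlib
-
-     BITS = 24            # toy block size; the announced real block is 64 bits
-     N = 1 << BITS
-
-     def toy_encrypt(x):
-         # Stand-in keyed mapping: truncated SHA-256 of the block value.
-         d = hashlib.sha256(b"fixed-key" + x.to_bytes(4, "big")).digest()
-         return int.from_bytes(d[:3], "big")     # 24-bit output
-
-     # Walk from one starting value until some value repeats.
-     seen, x, step = {}, 1, 0
-     while x not in seen:
-         seen[x] = step
-         x = toy_encrypt(x)
-         step += 1
-
-     tail, cycle = seen[x], step - seen[x]
-     print(f"cycle entered after {tail} steps; cycle length {cycle}")
-     print(f"sqrt(N) = {int(N ** 0.5)}  (rough expectation for a random mapping)")
-
- If the observed tail and cycle lengths sit near the random-mapping
- statistics, that is weak evidence the cipher behaves randomly--which
- is evidently the sort of agreement the panel reported.)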
-
- They also tested the system for strength against a differential
- cryptanalysis attack and found it worthy. There were really very
- few other technical details in the talk. Saying more would
- have divulged something about the algorithm.
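-
- (Differential cryptanalysis, in brief, hunts for input XOR
- differences that propagate to output differences with unusually high
- probability. A toy illustration of the kind of measurement involved,
- on a published 4-bit S-box--the first row of DES's S1, nothing from
- Skipjack, which remains classified:
-
-     # ddt[dx][dy] counts the input pairs with XOR difference dx
-     # whose S-box outputs differ by dy.
-     SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
-             0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
-
-     ddt = [[0] * 16 for _ in range(16)]
-     for x in range(16):
-         for dx in range(16):
-             dy = SBOX[x] ^ SBOX[x ^ dx]
-             ddt[dx][dy] += 1
-
-     worst = max(ddt[dx][dy] for dx in range(1, 16) for dy in range(16))
-     print(f"worst-case differential: {worst} of 16 input pairs")
-
- The lower that worst-case count, the flatter the table and the more
- chosen plaintext an attacker needs; "found it worthy" presumably
- means the panel found no usable spikes.)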
-
- My general impression is that the system is secure. Many people
- have played paranoid and expressed concerns that the classified
- algorithm might be hiding a trapdoor. It became clear to me that
- these concerns were really silly. There is a built-in trapdoor
- to be used by the government when it is "legally authorized" to
- intercept messages. The NSA has rarely had trouble in the past
- exercising either its explicitly granted legal authority or
- its implied authority. The phrase "national security" is a
- powerful pass phrase around Washington and there is no reason
- for me to believe that the NSA wouldn't get all of the access
- to the escrow database that it needs to do its job. Building in
- a backdoor would only leave a weakness for an opponent to exploit
- and that is something that is almost as sacrilegious at the NSA
- as just putting the classified secrets in a Fed Ex package to
- Saddam Hussein.
-
- Next there was a report from Geoff Greiveldinger, the man from the
- Department of Justice with the responsibility of implementing the
- Key Escrow plan. After the Clipper/Capstone/SkipJack chips are
- manufactured, they will be programmed with an individual id number and
- a secret, unique key. A list is made of the id, key pairs and this
- list is split into two halves by taking each unique key, k, and
- finding two numbers a and b such that a+b=k. (+ represents XOR). One
- new list will go to one of the escrow agencies and one will go to the
- other. It will be impossible to recover the secret key without getting
- the list entry from both agencies.
-
- At this point, they include an additional precaution. Each list
- will be encrypted so even the escrow agency won't be able to
- know what is in its list. The key for decoding this list will
- be locked away in the eavesdropping box. When a wiretap is authorized,
- each escrow agency will look up the halves of the key that correspond
- to the phone being tapped and send these to the eavesdropping box,
- where they will be decrypted and combined. That means that
- two clerks from the escrow agencies could not combine their
- knowledge. They would need access to a third key or an eavesdropping
- box.
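-
- (The splitting arithmetic is simple enough to sketch in a few lines
- of Python. The 80-bit key length matches the announced Skipjack key
- size; the XOR "wrapping" of each stored half is my toy stand-in for
- the encrypted lists, not the actual scheme:
-
-     import os
-
-     def xor(a, b):
-         return bytes(x ^ y for x, y in zip(a, b))
-
-     unit_key = os.urandom(10)            # the chip's unique 80-bit key k
-     share_a = os.urandom(len(unit_key))  # random half for escrow agency A
-     share_b = xor(unit_key, share_a)     # agency B's half, so a + b = k
-
-     # Neither share alone reveals anything about k; both are needed.
-     assert xor(share_a, share_b) == unit_key
-
-     # The extra precaution: each stored entry is itself encrypted,
-     # with the decoding keys locked inside the eavesdropping box.
-     box_key_a, box_key_b = os.urandom(10), os.urandom(10)
-     stored_a = xor(share_a, box_key_a)   # what agency A's list holds
-     stored_b = xor(share_b, box_key_b)   # what agency B's list holds
-
-     # Inside the box, after authorization: unwrap both halves, combine.
-     recovered = xor(xor(stored_a, box_key_a), xor(stored_b, box_key_b))
-     assert recovered == unit_key
-
- Two clerks pooling their stored entries get only wrapped values; the
- box keys are the "third key" mentioned above.)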
-
- It became clear that the system was not fully designed. It wasn't
- obvious how spontaneous and fully automated the system would
- be. Mr. Greiveldinger says that he is trying to balance the tradeoffs
- between security and efficiency. Officers are bound to be annoyed and
- hampered if they can't start a tap instantaneously. The kidnapping of
- a child is the prototypical example of when this would be necessary.
-
- The courts also grant authority for "roving" wiretaps that allow
- the police to intercept calls from any number of phones. A tap like
- this begs for a highly automated system for delivering the
- keys.
-
- I imagine that the system as it's designed will consist of escrow
- computers with a few clerks who have nothing to do all day. When
- a tap is authorized, the eavesdropping box will be programmed with
- a private key and shipped to the agents via overnight express. When
- they figure out the id number of the phone being tapped, the
- eavesdropping box will probably phone the two escrow computers,
- perform a bit of zero-knowledge authorization, and then receive the
- two halves of the key. This would allow them to switch lines and
- conduct roving taps effectively. The NSA would presumably have a box
- that would allow them to decrypt messages from foreign suspects.
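-
- (What that authorization handshake would look like was not
- specified. A plain challenge-response--not actually zero-knowledge,
- but the same shape of transaction--might run as follows; every name
- and parameter here is hypothetical:
-
-     import hashlib, hmac, os
-
-     # Secret installed in one fielded box when it is programmed,
-     # and known to the escrow computer (hypothetical arrangement).
-     BOX_SECRET = os.urandom(32)
-
-     def escrow_challenge():
-         return os.urandom(16)          # fresh nonce per request
-
-     def box_response(challenge, chip_id):
-         # The box proves knowledge of its secret, binding the chip id
-         # into the answer so the reply can't be reused elsewhere.
-         return hmac.new(BOX_SECRET, challenge + chip_id,
-                         hashlib.sha256).digest()
-
-     def escrow_verify(challenge, chip_id, response):
-         expected = hmac.new(BOX_SECRET, challenge + chip_id,
-                             hashlib.sha256).digest()
-         return hmac.compare_digest(expected, response)
-
-     chip_id = b"\x00\x00\xbe\xef"      # made-up unit id
-     c = escrow_challenge()
-     assert escrow_verify(c, chip_id, box_response(c, chip_id))
-     # Only after this check would an agency release its half-key.
-
- Note that nothing in such a handshake tells the escrow computer
- whether the warrant behind the request is real--which is exactly the
- weakness explored below.)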
-
- At this point, I had just listened to an entirely logical presentation
- from a perfect gentleman. We had just run through a system that had many
- nice technological checks and balances in it. Subverting it seemed
- very difficult. You would need access to the two escrow agencies and
- an eavesdropping box. Mr. Greiveldinger said that there would be many
- different "auditing" records that would be kept of the taps. It was
- very easy to feel rather secure about the whole system in a nice,
- air-conditioned auditorium where clean, nice legally precise people
- were speaking in measured tones. It was very easy to believe in
- the Rule of Law.
-
- To counteract this, I tried to figure out the easiest way for me
- to subvert the system. The simplest way is to be a police officer
- engaged in a stakeout of someone for whom you've already received
- a warrant. You request the Clipper eavesdropping box on the off chance
- that the suspect will buy a Clipper phone and then you "lend" it
- to a friend who needs it. I think that the automation will allow
- the person who possesses the box to listen in to whatever lines
- they want. The escrow agency doesn't maintain a list of people
- and id numbers-- they only know the list matching the id number to
- the secret key. There is no way that they would know that a request
- from the field was unreasonable. Yes, the audit trails could be
- used later to reconstruct what the box was used for, but that would
- only be necessary if someone got caught.
-
- The bribe value of this box would probably be hard to determine,
- but it could be very valuable. We know that the government of France
- is widely suspected of using its key escrow system to eavesdrop on
- US manufacturers in France. Would they be willing to buy eavesdropping
- time here in America? It is not uncommon to see reports of industrial
- espionage where the spies get millions of dollars. On the other hand,
- cops on the beat in NYC have been influenced for much less. The
- supply and demand theory of economics virtually guarantees that
- some deals are going to be done.
-
- It is not really clear what real effect the key escrow system is going
- to have on security. Yes, thieves would need to raid two different
- buildings and steal two different copies of the tapes. This is
- good. But it is still impossible to figure out if the requests from
- the field are legitimate-- at least within the time constraints posed
- by urgent cases involving terrorism and kidnapping.
-
- The net effect of implementing the system is that the phone system
- would be substantially strengthened against naive intruders, but the
- police (and those that bribe them) would still be able to eavesdrop
- with impunity. Everyone needs to begin to do a bit of calculus between
- the costs and benefits of this approach. On one hand, not letting the
- police intercept signals will let the crooks run free but on the other
- hand, the crooks are not about to use Clipper phones for their secrets
- if they know that they can be tapped.
-
- The most interesting speaker was the assistant director of the National
- Security Agency, Dr. Clint Brooks. He immediately admitted that the
- entire Clipper project was quite unusual because the Agency was not
- used to dealing with the open world. Speaking before a wide audience
- was strange for him and he admitted that producing a very low cost
- commercial competitive chip was also a new challenge for them.
-
- Nevertheless, I found him to be the deepest thinker at the conference.
- He readily admitted that the Clipper system isn't intended to catch
- any crooks. They'll just avoid the phones. It is just going to deny
- them access to the telecommunications system. They just won't be able
- to go into Radio Shack and buy a secure phone that comes off the line.
-
- It was apparent that he was somewhat skeptical of the Clipper's potential
- for success. He said at one point that the possibilities in the system
- made it worth taking the chance that it would succeed. If it could capture
- a large fraction of the market then it could help many efforts of the
- law enforcement and intelligence community.
-
- When I listened, though, I began to worry about what is going to happen
- as we begin to see the eventual blurring of data and voice communications
- systems. Right now, people go to Radio Shack to buy a phone. It's the
- only way you can use the phone system. In the future, computers, networks
- and telephones are going to be linked in much more sophisticated ways.
- I think that Intel and Microsoft are already working on such a technology.
-
- When this happens, programmable phones are going to emerge. People
- will be able to pop a new ROM in their cellular digital phone or
- install new software in their computer/video game/telephone. This
- could easily be a proprietary encryption system that scrambles
- everything. The traditional way of controlling technology by
- controlling the capital-intensive manufacturing sites will be gone. Sure,
- the NSA and the police will go to Radio Shack and say "We want your
- cooperation" and they'll get it. But it's the little, slippery ones
- that will be trouble in the new, software world.
-
- The end of the day was dominated by a panel of Law Enforcement specialists
- from around the country. These were sheriffs, district attorneys,
- FBI agents and other officers from different parts of the system.
- Their message was direct and they didn't hesitate to compare encryption
- with assault rifles. One even said, "I don't want to see the officers
- outgunned in a technical arena."
-
- They repeatedly stressed the incredible safeguards placed upon
- the wiretapping process and described the hurdles that the officers
- must go through to use the system. One DA from New Jersey said that
- in his office, they process about 10,000 cases a year, but they only
- do one to two wiretaps on average. It just seems like a big hassle
- and expense for them.
-
- It is common for the judges to require that the officers have very
- good circumstantial evidence from informers before giving them
- the warrant. This constraint coupled with the crooks' natural hesitation
- to use the phone meant that wiretaps weren't the world's greatest
- evidence producers.
-
- One moment of levity came when a board member asked what the criminals'
- favorite type of encryption was. The police refused to answer this one,
- and I'm not sure that they've encountered enough cases to build a
- profile.
-
- At the end of all of the earnestness and "support-the-cop-on-the-beat",
- I still began to wonder if there was much value to wiretaps at all. The
- police tried to use the low numbers of wiretaps as evidence that they're not
- out there abusing the system, but I kept thinking that this was mainly
- caused by the high cost and relatively low utility of the technique.
-
- It turns out that there is an easy way to check the utility of these
- devices. Only 37 states allow their state and local police to use
- wiretaps in investigations. One member of the panel repeated the rumor
- that the other states ban the technique because major politicians there
- were caught with wiretaps. The state legislatures in those states
- supposedly realized that recipients of graft and influence peddlers
- were the main targets of wiretaps. Eavesdropping just wasn't a tool
- against muggers.
- So they decided to protect themselves.
-
- It would be possible to check the crime statistics from each of these
- states and compare them against the eavesdropping states to discover
- which has a better record against crime. I would like to do this
- if I can dig up the list of states that allow the technique.
- I'm sure that this would prove little, but it could possibly clarify
- something about this technique.
-
- It is interesting to note that the House of Representatives committee
- on the Judiciary was holding hearings on abuses of the National Crime
- Information Center. They came in the same week as the latest round
- of Clipper hearings before the CSAB. The NCIC is a large computer
- system run by the FBI to provide all the police departments with a
- way to track down the past records of people. The widespread access
- to the system makes it quite vulnerable to abuse.
-
- In the hearings, the Congress heard many examples of unauthorized
- access. Some were as benign as people checking out employees. The
- worst was an ex-police officer who used the system to track down his
- ex-girlfriend and kill her. They also heard of a woman who looked
- up clients for her drug-dealing boyfriend so he could avoid the
- undercover cops.
-
- These hearings made it obvious that there were going to be problems
- determining the balance of grief. For every prototypical example of
- a child kidnapped to make child pornography, there is a renegade
- police officer out to knock off his ex-girlfriend. On the whole, the
- police may be much more trustworthy than the criminals, but we need
- to ask how often a system like Clipper will aid the bad guys.
-
-
- In the end, I reduced the calculus of the decision about Clipper to
- a simple tradeoff. If we allow widespread, secure encryption, will the
- criminals take great advantage of this system? The secure phones won't
- be useful in rapes and random street crime, but they'll be a big aid
- to organized endeavors. It would empower people to protect their own
- information unconditionally, but at the cost of letting the criminals
- do the same.
-
- Built-in back doors for the law enforcement community, on the other
- hand, will deny the power of off-the-shelf technology to crooks,
- but it would also leave everyone vulnerable to organized attacks
- on people.
-
- I began to wonder if the choice between Clipper and totally secure
- encryption was moot. In either case, there would be new opportunities
- for both the law-abiding and the law-ignoring. The amount of crime
- in the country would be limited only by the number of people who
- devote their lives to the game-- not by any newfangled technology
- that would shift the balance.
-
-
- I did not attend the Friday meeting so someone else will need to summarize
- the details.
-
-
-
-
-